00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v23.11" build number 351
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3016
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.024 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.024 The recommended git tool is: git
00:00:00.025 using credential 00000000-0000-0000-0000-000000000002
00:00:00.026 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.040 Fetching changes from the remote Git repository
00:00:00.043 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.057 Using shallow fetch with depth 1
00:00:00.057 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.057 > git --version # timeout=10
00:00:00.079 > git --version # 'git version 2.39.2'
00:00:00.079 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.080 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.080 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.309 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.319 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.330 Checking out Revision f964f6d3463483adf05cc5c086f2abd292e05f1d (FETCH_HEAD)
00:00:03.330 > git config core.sparsecheckout # timeout=10
00:00:03.341 > git read-tree -mu HEAD # timeout=10
00:00:03.356 > git checkout -f f964f6d3463483adf05cc5c086f2abd292e05f1d # timeout=5
00:00:03.378 Commit message: "ansible/roles/custom_facts: Drop nvme features"
00:00:03.379 > git rev-list --no-walk f964f6d3463483adf05cc5c086f2abd292e05f1d # timeout=10
00:00:03.588 [Pipeline] Start of Pipeline
00:00:03.602 [Pipeline] library
00:00:03.603 Loading library shm_lib@master
00:00:03.604 Library shm_lib@master is cached. Copying from home.
00:00:03.619 [Pipeline] node
00:00:03.633 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.635 [Pipeline] {
00:00:03.646 [Pipeline] catchError
00:00:03.648 [Pipeline] {
00:00:03.660 [Pipeline] wrap
00:00:03.668 [Pipeline] {
00:00:03.675 [Pipeline] stage
00:00:03.677 [Pipeline] { (Prologue)
00:00:03.853 [Pipeline] sh
00:00:04.136 + logger -p user.info -t JENKINS-CI
00:00:04.158 [Pipeline] echo
00:00:04.160 Node: WFP20
00:00:04.169 [Pipeline] sh
00:00:04.466 [Pipeline] setCustomBuildProperty
00:00:04.478 [Pipeline] echo
00:00:04.479 Cleanup processes
00:00:04.484 [Pipeline] sh
00:00:04.767 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.767 2492193 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.782 [Pipeline] sh
00:00:05.069 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.069 ++ grep -v 'sudo pgrep'
00:00:05.069 ++ awk '{print $1}'
00:00:05.069 + sudo kill -9
00:00:05.069 + true
00:00:05.084 [Pipeline] cleanWs
00:00:05.093 [WS-CLEANUP] Deleting project workspace...
00:00:05.093 [WS-CLEANUP] Deferred wipeout is used...
00:00:05.100 [WS-CLEANUP] done
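A minimal sketch of the cleanup idiom the trace above walks through: pgrep -af matches full command lines under the workspace, grep -v drops the pgrep invocation itself, awk keeps only the PIDs, and the trailing true keeps the step green when nothing is left to kill. The workspace path is taken from the log; the variable names are illustrative.

# kill any processes left over from a previous run of this job
ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
pids=$(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}')
sudo kill -9 $pids || true   # '|| true' is why the trace shows '+ true' when the list is empty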
00:00:05.104 [Pipeline] setCustomBuildProperty
00:00:05.117 [Pipeline] sh
00:00:05.399 + sudo git config --global --replace-all safe.directory '*'
00:00:05.479 [Pipeline] nodesByLabel
00:00:05.480 Found a total of 1 nodes with the 'sorcerer' label
00:00:05.491 [Pipeline] httpRequest
00:00:05.495 HttpMethod: GET
00:00:05.497 URL: http://10.211.164.96/packages/jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz
00:00:05.498 Sending request to url: http://10.211.164.96/packages/jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz
00:00:05.501 Response Code: HTTP/1.1 200 OK
00:00:05.502 Success: Status code 200 is in the accepted range: 200,404
00:00:05.502 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz
00:00:05.767 [Pipeline] sh
00:00:06.047 + tar --no-same-owner -xf jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz
00:00:06.066 [Pipeline] httpRequest
00:00:06.071 HttpMethod: GET
00:00:06.071 URL: http://10.211.164.96/packages/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:06.072 Sending request to url: http://10.211.164.96/packages/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:06.074 Response Code: HTTP/1.1 200 OK
00:00:06.074 Success: Status code 200 is in the accepted range: 200,404
00:00:06.075 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:19.210 [Pipeline] sh
00:00:19.490 + tar --no-same-owner -xf spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:22.038 [Pipeline] sh
00:00:22.320 + git -C spdk log --oneline -n5
00:00:22.320 36faa8c31 bdev/nvme: Fix the case that namespace was removed during reset
00:00:22.320 e2cb5a5ee bdev/nvme: Factor out nvme_ns active/inactive check into a helper function
00:00:22.320 4b134b4ab bdev/nvme: Delay callbacks when the next operation is a failover
00:00:22.320 d2ea4ecb1 llvm/vfio: Suppress checking leaks for `spdk_nvme_ctrlr_alloc_io_qpair`
00:00:22.320 3b33f4333 test/nvme/cuse: Fix typo
00:00:22.339 [Pipeline] withCredentials
00:00:22.349 > git --version # timeout=10
00:00:22.360 > git --version # 'git version 2.39.2'
00:00:22.376 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:00:22.378 [Pipeline] {
00:00:22.387 [Pipeline] retry
00:00:22.388 [Pipeline] {
00:00:22.401 [Pipeline] sh
00:00:22.682 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11
00:00:22.954 [Pipeline] }
00:00:22.974 [Pipeline] // retry
00:00:22.978 [Pipeline] }
00:00:22.997 [Pipeline] // withCredentials
00:00:23.007 [Pipeline] httpRequest
00:00:23.011 HttpMethod: GET
00:00:23.011 URL: http://10.211.164.96/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:00:23.012 Sending request to url: http://10.211.164.96/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:00:23.038 Response Code: HTTP/1.1 200 OK
00:00:23.038 Success: Status code 200 is in the accepted range: 200,404
00:00:23.039 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:18.619 [Pipeline] sh
00:01:18.906 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:20.297 [Pipeline] sh
00:01:20.578 + git -C dpdk log --oneline -n5
00:01:20.578 eeb0605f11 version: 23.11.0
00:01:20.578 238778122a doc: update release notes for 23.11
00:01:20.578 46aa6b3cfc doc: fix description of RSS features
00:01:20.578 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:20.578 7e421ae345 devtools: support skipping forbid rule check
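The checkout pattern above (a depth-1 fetch of one ref, then a hard checkout of a pinned revision, with the same sources also served as tarballs from the lab mirror) is what keeps every node on an identical tree. A rough shell equivalent of those steps; curl stands in for the Jenkins httpRequest step and is an assumption, not the pipeline's actual code:

# shallow-fetch a single branch, then pin the workspace to the exact commit
git fetch --tags --force --progress --depth=1 -- \
    https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
git checkout -f f964f6d3463483adf05cc5c086f2abd292e05f1d
# mirrored tarball of the same revision; --no-same-owner stops tar from
# restoring the archive's original uid/gid when unpacking
curl -sO http://10.211.164.96/packages/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
tar --no-same-owner -xf spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz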
00:01:20.589 [Pipeline] }
00:01:20.606 [Pipeline] // stage
00:01:20.615 [Pipeline] stage
00:01:20.617 [Pipeline] { (Prepare)
00:01:20.634 [Pipeline] writeFile
00:01:20.646 [Pipeline] sh
00:01:20.926 + logger -p user.info -t JENKINS-CI
00:01:20.937 [Pipeline] sh
00:01:21.216 + logger -p user.info -t JENKINS-CI
00:01:21.228 [Pipeline] sh
00:01:21.512 + cat autorun-spdk.conf
00:01:21.512 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:21.512 SPDK_RUN_UBSAN=1
00:01:21.512 SPDK_TEST_FUZZER=1
00:01:21.512 SPDK_TEST_FUZZER_SHORT=1
00:01:21.512 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:21.512 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:21.519 RUN_NIGHTLY=1
00:01:21.523 [Pipeline] readFile
00:01:21.538 [Pipeline] withEnv
00:01:21.540 [Pipeline] {
00:01:21.550 [Pipeline] sh
00:01:21.834 + set -ex
00:01:21.834 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:21.834 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:21.834 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:21.834 ++ SPDK_RUN_UBSAN=1
00:01:21.834 ++ SPDK_TEST_FUZZER=1
00:01:21.834 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:21.834 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:21.834 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:21.834 ++ RUN_NIGHTLY=1
00:01:21.834 + case $SPDK_TEST_NVMF_NICS in
00:01:21.834 + DRIVERS=
00:01:21.834 + [[ -n '' ]]
00:01:21.834 + exit 0
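autorun-spdk.conf is plain KEY=value shell, so consuming it is just a guarded source under set -ex, which is why each assignment echoes back with a ++ prefix in the trace above. A minimal sketch of that pattern, with paths taken from the log:

set -ex                          # abort on any error, trace every command
conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
[[ -f $conf ]]                   # fail fast if the job never wrote the file
source "$conf"                   # brings SPDK_TEST_FUZZER_SHORT=1 etc. into scope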
00:01:21.844 [Pipeline] }
00:01:21.861 [Pipeline] // withEnv
00:01:21.866 [Pipeline] }
00:01:21.883 [Pipeline] // stage
00:01:21.892 [Pipeline] catchError
00:01:21.893 [Pipeline] {
00:01:21.908 [Pipeline] timeout
00:01:21.908 Timeout set to expire in 30 min
00:01:21.910 [Pipeline] {
00:01:21.926 [Pipeline] stage
00:01:21.928 [Pipeline] { (Tests)
00:01:21.946 [Pipeline] sh
00:01:22.249 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:22.249 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:22.249 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:22.249 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:22.249 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:22.249 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:22.249 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:22.249 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:22.249 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:22.249 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:22.249 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:22.249 + source /etc/os-release
00:01:22.249 ++ NAME='Fedora Linux'
00:01:22.249 ++ VERSION='38 (Cloud Edition)'
00:01:22.249 ++ ID=fedora
00:01:22.249 ++ VERSION_ID=38
00:01:22.249 ++ VERSION_CODENAME=
00:01:22.249 ++ PLATFORM_ID=platform:f38
00:01:22.249 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:22.249 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:22.249 ++ LOGO=fedora-logo-icon
00:01:22.249 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:22.249 ++ HOME_URL=https://fedoraproject.org/
00:01:22.249 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:22.249 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:22.249 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:22.249 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:22.249 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:22.249 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:22.249 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:22.249 ++ SUPPORT_END=2024-05-14
00:01:22.249 ++ VARIANT='Cloud Edition'
00:01:22.249 ++ VARIANT_ID=cloud
00:01:22.249 + uname -a
00:01:22.249 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:22.249 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:24.794 Hugepages
00:01:24.794 node hugesize free / total
00:01:24.794 node0 1048576kB 0 / 0
00:01:24.794 node0 2048kB 0 / 0
00:01:24.794 node1 1048576kB 0 / 0
00:01:24.794 node1 2048kB 0 / 0
00:01:24.794
00:01:24.794 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:24.794 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:24.794 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:25.053 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:25.053 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:25.053 + rm -f /tmp/spdk-ld-path
00:01:25.053 + source autorun-spdk.conf
00:01:25.053 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:25.053 ++ SPDK_RUN_UBSAN=1
00:01:25.053 ++ SPDK_TEST_FUZZER=1
00:01:25.053 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:25.053 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:25.053 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:25.053 ++ RUN_NIGHTLY=1
00:01:25.053 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:25.053 + [[ -n '' ]]
00:01:25.053 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:25.053 + for M in /var/spdk/build-*-manifest.txt
00:01:25.053 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:25.053 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:25.053 + for M in /var/spdk/build-*-manifest.txt
00:01:25.053 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:25.053 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:25.053 ++ uname
00:01:25.053 + [[ Linux == \L\i\n\u\x ]]
00:01:25.053 + sudo dmesg -T
00:01:25.053 + sudo dmesg --clear
00:01:25.053 + dmesg_pid=2493099
00:01:25.053 + [[ Fedora Linux == FreeBSD ]]
00:01:25.053 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:25.053 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:25.053 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:25.053 + [[ -x /usr/src/fio-static/fio ]]
00:01:25.053 + export FIO_BIN=/usr/src/fio-static/fio
00:01:25.053 + FIO_BIN=/usr/src/fio-static/fio
00:01:25.053 + sudo dmesg -Tw
00:01:25.053 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:25.053 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:25.053 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:25.053 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:25.053 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:25.053 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:25.053 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:25.053 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:25.053 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:25.313 Test configuration:
00:01:25.313 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:25.313 SPDK_RUN_UBSAN=1
00:01:25.313 SPDK_TEST_FUZZER=1
00:01:25.313 SPDK_TEST_FUZZER_SHORT=1
00:01:25.313 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:25.313 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:25.313 RUN_NIGHTLY=1
06:43:55 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
06:43:55 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
06:43:55 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
06:43:55 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
06:43:55 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
06:43:55 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
06:43:55 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
06:43:55 -- paths/export.sh@5 -- $ export PATH
06:43:55 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
06:43:55 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
06:43:55 -- common/autobuild_common.sh@435 -- $ date +%s
06:43:55 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714193035.XXXXXX
06:43:55 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714193035.nMNKdT
06:43:55 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
06:43:55 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']'
06:43:55 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
06:43:55 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
06:43:55 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
06:43:55 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
06:43:55 -- common/autobuild_common.sh@451 -- $ get_config_params
06:43:55 -- common/autotest_common.sh@387 -- $ xtrace_disable
06:43:55 -- common/autotest_common.sh@10 -- $ set +x
06:43:55 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
06:43:55 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
06:43:55 -- spdk/autobuild.sh@12 -- $ umask 022
06:43:55 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
06:43:55 -- spdk/autobuild.sh@16 -- $ date -u
00:01:25.313 Sat Apr 27 04:43:55 AM UTC 2024
06:43:55 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:25.313 LTS-24-g36faa8c31
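The SPDK_WORKSPACE lines above show the per-run scratch directory being derived from the epoch time; a sketch of that idiom, with names and values taken from the trace (the explicit export is an assumption):

ts=$(date +%s)                                    # 1714193035 in this run
SPDK_WORKSPACE=$(mktemp -dt "spdk_${ts}.XXXXXX")  # e.g. /tmp/spdk_1714193035.nMNKdT
export SPDK_WORKSPACE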
06:43:55 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
06:43:55 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
06:43:55 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
06:43:55 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
06:43:55 -- common/autotest_common.sh@1083 -- $ xtrace_disable
06:43:55 -- common/autotest_common.sh@10 -- $ set +x
00:01:25.314 ************************************
00:01:25.314 START TEST ubsan
00:01:25.314 ************************************
06:43:55 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:25.314 using ubsan
00:01:25.314
00:01:25.314 real 0m0.000s
00:01:25.314 user 0m0.000s
00:01:25.314 sys 0m0.000s
06:43:55 -- common/autotest_common.sh@1105 -- $ xtrace_disable
06:43:55 -- common/autotest_common.sh@10 -- $ set +x
00:01:25.314 ************************************
00:01:25.314 END TEST ubsan
00:01:25.314 ************************************
06:43:55 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']'
06:43:55 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
06:43:55 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk
06:43:55 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
06:43:55 -- common/autotest_common.sh@1083 -- $ xtrace_disable
06:43:55 -- common/autotest_common.sh@10 -- $ set +x
00:01:25.314 ************************************
00:01:25.314 START TEST build_native_dpdk
00:01:25.314 ************************************
06:43:55 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk
06:43:55 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
06:43:55 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
06:43:55 -- common/autobuild_common.sh@50 -- $ local compiler_version
06:43:55 -- common/autobuild_common.sh@51 -- $ local compiler
06:43:55 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
06:43:55 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
06:43:55 -- common/autobuild_common.sh@55 -- $ compiler=gcc
06:43:55 -- common/autobuild_common.sh@61 -- $ export CC=gcc
06:43:55 -- common/autobuild_common.sh@61 -- $ CC=gcc
06:43:55 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
06:43:55 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
06:43:55 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
06:43:55 -- common/autobuild_common.sh@68 -- $ compiler_version=13
06:43:55 -- common/autobuild_common.sh@69 -- $ compiler_version=13
06:43:55 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
06:43:55 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
06:43:55 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
06:43:55 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
06:43:55 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
06:43:55 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:25.314 eeb0605f11 version: 23.11.0
00:01:25.314 238778122a doc: update release notes for 23.11
00:01:25.314 46aa6b3cfc doc: fix description of RSS features
00:01:25.314 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:25.314 7e421ae345 devtools: support skipping forbid rule check
06:43:55 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
06:43:55 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
06:43:55 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0
06:43:55 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
06:43:55 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
06:43:55 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
06:43:55 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
06:43:55 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
06:43:55 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
06:43:55 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
06:43:55 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
06:43:55 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
06:43:55 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
06:43:55 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
06:43:55 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
06:43:55 -- common/autobuild_common.sh@168 -- $ uname -s
06:43:55 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
06:43:55 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0
06:43:55 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0
06:43:55 -- scripts/common.sh@332 -- $ local ver1 ver1_l
06:43:55 -- scripts/common.sh@333 -- $ local ver2 ver2_l
06:43:55 -- scripts/common.sh@335 -- $ IFS=.-:
06:43:55 -- scripts/common.sh@335 -- $ read -ra ver1
06:43:55 -- scripts/common.sh@336 -- $ IFS=.-:
06:43:55 -- scripts/common.sh@336 -- $ read -ra ver2
06:43:55 -- scripts/common.sh@337 -- $ local 'op=<'
06:43:55 -- scripts/common.sh@339 -- $ ver1_l=3
06:43:55 -- scripts/common.sh@340 -- $ ver2_l=3
06:43:55 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
06:43:55 -- scripts/common.sh@343 -- $ case "$op" in
06:43:55 -- scripts/common.sh@344 -- $ : 1
06:43:55 -- scripts/common.sh@363 -- $ (( v = 0 ))
06:43:55 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
06:43:55 -- scripts/common.sh@364 -- $ decimal 23
06:43:55 -- scripts/common.sh@352 -- $ local d=23
06:43:55 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]]
06:43:55 -- scripts/common.sh@354 -- $ echo 23
06:43:55 -- scripts/common.sh@364 -- $ ver1[v]=23
06:43:55 -- scripts/common.sh@365 -- $ decimal 21
06:43:55 -- scripts/common.sh@352 -- $ local d=21
06:43:55 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]]
06:43:55 -- scripts/common.sh@354 -- $ echo 21
06:43:55 -- scripts/common.sh@365 -- $ ver2[v]=21
06:43:55 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
06:43:55 -- scripts/common.sh@366 -- $ return 1
06:43:55 -- common/autobuild_common.sh@173 -- $ patch -p1
00:01:25.573 patching file config/rte_config.h
00:01:25.573 Hunk #1 succeeded at 60 (offset 1 line).
06:43:55 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false
06:43:55 -- common/autobuild_common.sh@178 -- $ uname -s
06:43:55 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']'
06:43:55 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
06:43:55 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
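The -Denable_drivers value on the meson line above is produced by the printf just before it: printf repeats its format once per argument, so %s, emits each driver followed by a comma. A trimmed sketch of that join feeding the configure call; it uses the modern `meson setup` spelling, since the warning later in this log notes that the bare `meson [options]` form is deprecated:

DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
drivers=$(printf %s, "${DPDK_DRIVERS[@]}")   # -> bus,bus/pci,...,net/i40e/base,
meson setup build-tmp \
    --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
    --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false \
    -Denable_drivers="$drivers"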
00:01:30.850 The Meson build system
00:01:30.850 Version: 1.3.1
00:01:30.850 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:30.850 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:30.850 Build type: native build
00:01:30.850 Program cat found: YES (/usr/bin/cat)
00:01:30.850 Project name: DPDK
00:01:30.850 Project version: 23.11.0
00:01:30.850 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:30.850 C linker for the host machine: gcc ld.bfd 2.39-16
00:01:30.850 Host machine cpu family: x86_64
00:01:30.850 Host machine cpu: x86_64
00:01:30.850 Message: ## Building in Developer Mode ##
00:01:30.850 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:30.850 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:30.850 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:30.850 Program python3 found: YES (/usr/bin/python3)
00:01:30.850 Program cat found: YES (/usr/bin/cat)
00:01:30.850 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:01:30.850 Compiler for C supports arguments -march=native: YES
00:01:30.850 Checking for size of "void *" : 8
00:01:30.850 Checking for size of "void *" : 8 (cached)
00:01:30.850 Library m found: YES
00:01:30.850 Library numa found: YES
00:01:30.850 Has header "numaif.h" : YES
00:01:30.850 Library fdt found: NO
00:01:30.850 Library execinfo found: NO
00:01:30.850 Has header "execinfo.h" : YES
00:01:30.850 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:30.850 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:30.850 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:30.850 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:30.850 Run-time dependency openssl found: YES 3.0.9
00:01:30.850 Run-time dependency libpcap found: YES 1.10.4
00:01:30.850 Has header "pcap.h" with dependency libpcap: YES
00:01:30.850 Compiler for C supports arguments -Wcast-qual: YES
00:01:30.850 Compiler for C supports arguments -Wdeprecated: YES
00:01:30.850 Compiler for C supports arguments -Wformat: YES
00:01:30.850 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:30.850 Compiler for C supports arguments -Wformat-security: NO
00:01:30.850 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:30.850 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:30.850 Compiler for C supports arguments -Wnested-externs: YES
00:01:30.850 Compiler for C supports arguments -Wold-style-definition: YES
00:01:30.850 Compiler for C supports arguments -Wpointer-arith: YES
00:01:30.850 Compiler for C supports arguments -Wsign-compare: YES
00:01:30.850 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:30.850 Compiler for C supports arguments -Wundef: YES
00:01:30.850 Compiler for C supports arguments -Wwrite-strings: YES
00:01:30.850 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:30.850 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:30.850 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:30.850 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:30.850 Program objdump found: YES (/usr/bin/objdump)
00:01:30.850 Compiler for C supports arguments -mavx512f: YES
00:01:30.850 Checking if "AVX512 checking" compiles: YES
00:01:30.850 Fetching value of define "__SSE4_2__" : 1
00:01:30.850 Fetching value of define "__AES__" : 1
00:01:30.850 Fetching value of define "__AVX__" : 1
00:01:30.850 Fetching value of define "__AVX2__" : 1
00:01:30.850 Fetching value of define "__AVX512BW__" : 1
00:01:30.850 Fetching value of define "__AVX512CD__" : 1
00:01:30.850 Fetching value of define "__AVX512DQ__" : 1
00:01:30.850 Fetching value of define "__AVX512F__" : 1
00:01:30.850 Fetching value of define "__AVX512VL__" : 1
00:01:30.850 Fetching value of define "__PCLMUL__" : 1
00:01:30.850 Fetching value of define "__RDRND__" : 1
00:01:30.850 Fetching value of define "__RDSEED__" : 1
00:01:30.850 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:30.850 Fetching value of define "__znver1__" : (undefined)
00:01:30.850 Fetching value of define "__znver2__" : (undefined)
00:01:30.850 Fetching value of define "__znver3__" : (undefined)
00:01:30.850 Fetching value of define "__znver4__" : (undefined)
00:01:30.850 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:30.850 Message: lib/log: Defining dependency "log"
00:01:30.850 Message: lib/kvargs: Defining dependency "kvargs"
00:01:30.850 Message: lib/telemetry: Defining dependency "telemetry"
00:01:30.850 Checking for function "getentropy" : NO
00:01:30.850 Message: lib/eal: Defining dependency "eal"
00:01:30.850 Message: lib/ring: Defining dependency "ring"
00:01:30.850 Message: lib/rcu: Defining dependency "rcu"
00:01:30.850 Message: lib/mempool: Defining dependency "mempool"
00:01:30.850 Message: lib/mbuf: Defining dependency "mbuf"
00:01:30.850 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:30.850 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:30.850 Compiler for C supports arguments -mpclmul: YES
00:01:30.850 Compiler for C supports arguments -maes: YES
00:01:30.850 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:30.850 Compiler for C supports arguments -mavx512bw: YES
00:01:30.850 Compiler for C supports arguments -mavx512dq: YES
00:01:30.850 Compiler for C supports arguments -mavx512vl: YES
00:01:30.850 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:30.850 Compiler for C supports arguments -mavx2: YES
00:01:30.850 Compiler for C supports arguments -mavx: YES
00:01:30.850 Message: lib/net: Defining dependency "net"
00:01:30.850 Message: lib/meter: Defining dependency "meter"
00:01:30.850 Message: lib/ethdev: Defining dependency "ethdev"
00:01:30.850 Message: lib/pci: Defining dependency "pci"
00:01:30.850 Message: lib/cmdline: Defining dependency "cmdline"
00:01:30.850 Message: lib/metrics: Defining dependency "metrics"
00:01:30.850 Message: lib/hash: Defining dependency "hash"
00:01:30.850 Message: lib/timer: Defining dependency "timer"
00:01:30.850 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512CD__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:30.850 Message: lib/acl: Defining dependency "acl"
00:01:30.850 Message: lib/bbdev: Defining dependency "bbdev"
00:01:30.850 Message: lib/bitratestats: Defining dependency "bitratestats"
00:01:30.850 Run-time dependency libelf found: YES 0.190
00:01:30.850 Message: lib/bpf: Defining dependency "bpf"
00:01:30.850 Message: lib/cfgfile: Defining dependency "cfgfile"
00:01:30.850 Message: lib/compressdev: Defining dependency "compressdev"
00:01:30.850 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:30.850 Message: lib/distributor: Defining dependency "distributor"
00:01:30.850 Message: lib/dmadev: Defining dependency "dmadev"
00:01:30.850 Message: lib/efd: Defining dependency "efd"
00:01:30.850 Message: lib/eventdev: Defining dependency "eventdev"
00:01:30.850 Message: lib/dispatcher: Defining dependency "dispatcher"
00:01:30.850 Message: lib/gpudev: Defining dependency "gpudev"
00:01:30.850 Message: lib/gro: Defining dependency "gro"
00:01:30.850 Message: lib/gso: Defining dependency "gso"
00:01:30.850 Message: lib/ip_frag: Defining dependency "ip_frag"
00:01:30.850 Message: lib/jobstats: Defining dependency "jobstats"
00:01:30.850 Message: lib/latencystats: Defining dependency "latencystats"
00:01:30.850 Message: lib/lpm: Defining dependency "lpm"
00:01:30.850 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512IFMA__" : (undefined)
00:01:30.850 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:01:30.850 Message: lib/member: Defining dependency "member"
00:01:30.850 Message: lib/pcapng: Defining dependency "pcapng"
00:01:30.850 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:30.850 Message: lib/power: Defining dependency "power"
00:01:30.850 Message: lib/rawdev: Defining dependency "rawdev"
00:01:30.850 Message: lib/regexdev: Defining dependency "regexdev"
00:01:30.850 Message: lib/mldev: Defining dependency "mldev"
00:01:30.850 Message: lib/rib: Defining dependency "rib"
00:01:30.850 Message: lib/reorder: Defining dependency "reorder"
00:01:30.850 Message: lib/sched: Defining dependency "sched"
00:01:30.850 Message: lib/security: Defining dependency "security"
00:01:30.850 Message: lib/stack: Defining dependency "stack"
00:01:30.850 Has header "linux/userfaultfd.h" : YES
00:01:30.850 Has header "linux/vduse.h" : YES
00:01:30.850 Message: lib/vhost: Defining dependency "vhost"
00:01:30.850 Message: lib/ipsec: Defining dependency "ipsec"
00:01:30.850 Message: lib/pdcp: Defining dependency "pdcp"
00:01:30.850 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:30.850 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:30.851 Message: lib/fib: Defining dependency "fib"
00:01:30.851 Message: lib/port: Defining dependency "port"
00:01:30.851 Message: lib/pdump: Defining dependency "pdump"
00:01:30.851 Message: lib/table: Defining dependency "table"
00:01:30.851 Message: lib/pipeline: Defining dependency "pipeline"
00:01:30.851 Message: lib/graph: Defining dependency "graph"
00:01:30.851 Message: lib/node: Defining dependency "node"
00:01:30.851 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:31.447 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:31.447 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:31.447 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:31.447 Compiler for C supports arguments -Wno-sign-compare: YES
00:01:31.447 Compiler for C supports arguments -Wno-unused-value: YES
00:01:31.447 Compiler for C supports arguments -Wno-format: YES
00:01:31.447 Compiler for C supports arguments -Wno-format-security: YES
00:01:31.447 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:01:31.447 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:01:31.447 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:01:31.447 Compiler for C supports arguments -Wno-unused-parameter: YES
00:01:31.447 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:31.447 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:31.447 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:31.447 Compiler for C supports arguments -mavx512bw: YES (cached)
00:01:31.447 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:31.447 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:31.447 Has header "sys/epoll.h" : YES
00:01:31.447 Program doxygen found: YES (/usr/bin/doxygen)
00:01:31.447 Configuring doxy-api-html.conf using configuration
00:01:31.447 Configuring doxy-api-man.conf using configuration
00:01:31.447 Program mandb found: YES (/usr/bin/mandb)
00:01:31.447 Program sphinx-build found: NO
00:01:31.447 Configuring rte_build_config.h using configuration
00:01:31.447 Message:
00:01:31.447 =================
00:01:31.447 Applications Enabled
00:01:31.447 =================
00:01:31.447
00:01:31.447 apps:
00:01:31.447 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:01:31.447 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:01:31.447 test-pmd, test-regex, test-sad, test-security-perf,
00:01:31.447
00:01:31.447 Message:
00:01:31.447 =================
00:01:31.447 Libraries Enabled
00:01:31.447 =================
00:01:31.447
00:01:31.447 libs:
00:01:31.447 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:31.447 net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:01:31.447 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:01:31.447 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:01:31.447 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:01:31.447 mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:01:31.447 pdcp, fib, port, pdump, table, pipeline, graph, node,
00:01:31.447
00:01:31.447
00:01:31.447 Message:
00:01:31.447 ===============
00:01:31.447 Drivers Enabled
00:01:31.447 ===============
00:01:31.447
00:01:31.447 common:
00:01:31.447
00:01:31.447 bus:
00:01:31.447 pci, vdev,
00:01:31.447 mempool:
00:01:31.447 ring,
00:01:31.447 dma:
00:01:31.447
00:01:31.447 net:
00:01:31.447 i40e,
00:01:31.447 raw:
00:01:31.447
00:01:31.447 crypto:
00:01:31.447
00:01:31.447 compress:
00:01:31.447
00:01:31.447 regex:
00:01:31.447
00:01:31.447 ml:
00:01:31.447
00:01:31.447 vdpa:
00:01:31.447
00:01:31.447 event:
00:01:31.447
00:01:31.447 baseband:
00:01:31.447
00:01:31.447 gpu:
00:01:31.447
00:01:31.447
00:01:31.447 Message:
00:01:31.447 =================
00:01:31.447 Content Skipped
00:01:31.447 =================
00:01:31.447
00:01:31.447 apps:
00:01:31.447
00:01:31.447 libs:
00:01:31.447
00:01:31.447 drivers:
00:01:31.447 common/cpt: not in enabled drivers build config
00:01:31.447 common/dpaax: not in enabled drivers build config
00:01:31.447 common/iavf: not in enabled drivers build config
00:01:31.447 common/idpf: not in enabled drivers build config
00:01:31.447 common/mvep: not in enabled drivers build config
00:01:31.447 common/octeontx: not in enabled drivers build config
00:01:31.447 bus/auxiliary: not in enabled drivers build config
00:01:31.447 bus/cdx: not in enabled drivers build config
00:01:31.447 bus/dpaa: not in enabled drivers build config
00:01:31.447 bus/fslmc: not in enabled drivers build config
00:01:31.447 bus/ifpga: not in enabled drivers build config
00:01:31.447 bus/platform: not in enabled drivers build config
00:01:31.447 bus/vmbus: not in enabled drivers build config
00:01:31.447 common/cnxk: not in enabled drivers build config
00:01:31.447 common/mlx5: not in enabled drivers build config
00:01:31.447 common/nfp: not in enabled drivers build config
00:01:31.447 common/qat: not in enabled drivers build config
00:01:31.447 common/sfc_efx: not in enabled drivers build config
00:01:31.447 mempool/bucket: not in enabled drivers build config
00:01:31.447 mempool/cnxk: not in enabled drivers build config
00:01:31.447 mempool/dpaa: not in enabled drivers build config
00:01:31.447 mempool/dpaa2: not in enabled drivers build config
00:01:31.447 mempool/octeontx: not in enabled drivers build config
00:01:31.447 mempool/stack: not in enabled drivers build config
00:01:31.447 dma/cnxk: not in enabled drivers build config
00:01:31.447 dma/dpaa: not in enabled drivers build config
00:01:31.447 dma/dpaa2: not in enabled drivers build config
00:01:31.447 dma/hisilicon: not in enabled drivers build config
00:01:31.447 dma/idxd: not in enabled drivers build config
00:01:31.447 dma/ioat: not in enabled drivers build config
00:01:31.447 dma/skeleton: not in enabled drivers build config
00:01:31.447 net/af_packet: not in enabled drivers build config
00:01:31.447 net/af_xdp: not in enabled drivers build config
00:01:31.447 net/ark: not in enabled drivers build config
00:01:31.447 net/atlantic: not in enabled drivers build config
00:01:31.447 net/avp: not in enabled drivers build config
00:01:31.448 net/axgbe: not in enabled drivers build config
00:01:31.448 net/bnx2x: not in enabled drivers build config
00:01:31.448 net/bnxt: not in enabled drivers build config
00:01:31.448 net/bonding: not in enabled drivers build config
00:01:31.448 net/cnxk: not in enabled drivers build config
00:01:31.448 net/cpfl: not in enabled drivers build config
00:01:31.448 net/cxgbe: not in enabled drivers build config
00:01:31.448 net/dpaa: not in enabled drivers build config
00:01:31.448 net/dpaa2: not in enabled drivers build config
00:01:31.448 net/e1000: not in enabled drivers build config
00:01:31.448 net/ena: not in enabled drivers build config
00:01:31.448 net/enetc: not in enabled drivers build config
00:01:31.448 net/enetfec: not in enabled drivers build config
00:01:31.448 net/enic: not in enabled drivers build config
00:01:31.448 net/failsafe: not in enabled drivers build config
00:01:31.448 net/fm10k: not in enabled drivers build config
00:01:31.448 net/gve: not in enabled drivers build config
00:01:31.448 net/hinic: not in enabled drivers build config
00:01:31.448 net/hns3: not in enabled drivers build config
00:01:31.448 net/iavf: not in enabled drivers build config
00:01:31.448 net/ice: not in enabled drivers build config
00:01:31.448 net/idpf: not in enabled drivers build config
00:01:31.448 net/igc: not in enabled drivers build config
00:01:31.448 net/ionic: not in enabled drivers build config
00:01:31.448 net/ipn3ke: not in enabled drivers build config
00:01:31.448 net/ixgbe: not in enabled drivers build config
00:01:31.448 net/mana: not in enabled drivers build config
00:01:31.448 net/memif: not in enabled drivers build config
00:01:31.448 net/mlx4: not in enabled drivers build config
00:01:31.448 net/mlx5: not in enabled drivers build config
00:01:31.448 net/mvneta: not in enabled drivers build config
00:01:31.448 net/mvpp2: not in enabled drivers build config
00:01:31.448 net/netvsc: not in enabled drivers build config
00:01:31.448 net/nfb: not in enabled drivers build config
00:01:31.448 net/nfp: not in enabled drivers build config
00:01:31.448 net/ngbe: not in enabled drivers build config
00:01:31.448 net/null: not in enabled drivers build config
00:01:31.448 net/octeontx: not in enabled drivers build config
00:01:31.448 net/octeon_ep: not in enabled drivers build config
00:01:31.448 net/pcap: not in enabled drivers build config
00:01:31.448 net/pfe: not in enabled drivers build config
00:01:31.448 net/qede: not in enabled drivers build config
00:01:31.448 net/ring: not in enabled drivers build config
00:01:31.448 net/sfc: not in enabled drivers build config
00:01:31.448 net/softnic: not in enabled drivers build config
00:01:31.448 net/tap: not in enabled drivers build config
00:01:31.448 net/thunderx: not in enabled drivers build config
00:01:31.448 net/txgbe: not in enabled drivers build config
00:01:31.448 net/vdev_netvsc: not in enabled drivers build config
00:01:31.448 net/vhost: not in enabled drivers build config
00:01:31.448 net/virtio: not in enabled drivers build config
00:01:31.448 net/vmxnet3: not in enabled drivers build config
00:01:31.448 raw/cnxk_bphy: not in enabled drivers build config
00:01:31.448 raw/cnxk_gpio: not in enabled drivers build config
00:01:31.448 raw/dpaa2_cmdif: not in enabled drivers build config
00:01:31.448 raw/ifpga: not in enabled drivers build config
00:01:31.448 raw/ntb: not in enabled drivers build config
00:01:31.448 raw/skeleton: not in enabled drivers build config
00:01:31.448 crypto/armv8: not in enabled drivers build config
00:01:31.448 crypto/bcmfs: not in enabled drivers build config
00:01:31.448 crypto/caam_jr: not in enabled drivers build config
00:01:31.448 crypto/ccp: not in enabled drivers build config
00:01:31.448 crypto/cnxk: not in enabled drivers build config
00:01:31.448 crypto/dpaa_sec: not in enabled drivers build config
00:01:31.448 crypto/dpaa2_sec: not in enabled drivers build config
00:01:31.448 crypto/ipsec_mb: not in enabled drivers build config
00:01:31.448 crypto/mlx5: not in enabled drivers build config
00:01:31.448 crypto/mvsam: not in enabled drivers build config
00:01:31.448 crypto/nitrox: not in enabled drivers build config
00:01:31.448 crypto/null: not in enabled drivers build config
00:01:31.448 crypto/octeontx: not in enabled drivers build config
00:01:31.448 crypto/openssl: not in enabled drivers build config
00:01:31.448 crypto/scheduler: not in enabled drivers build config
00:01:31.448 crypto/uadk: not in enabled drivers build config
00:01:31.448 crypto/virtio: not in enabled drivers build config
00:01:31.448 compress/isal: not in enabled drivers build config
00:01:31.448 compress/mlx5: not in enabled drivers build config
00:01:31.448 compress/octeontx: not in enabled drivers build config
00:01:31.448 compress/zlib: not in enabled drivers build config
00:01:31.448 regex/mlx5: not in enabled drivers build config
00:01:31.448 regex/cn9k: not in enabled drivers build config
00:01:31.448 ml/cnxk: not in enabled drivers build config
00:01:31.448 vdpa/ifc: not in enabled drivers build config
00:01:31.448 vdpa/mlx5: not in enabled drivers build config
00:01:31.448 vdpa/nfp: not in enabled drivers build config
00:01:31.448 vdpa/sfc: not in enabled drivers build config
00:01:31.448 event/cnxk: not in enabled drivers build config
00:01:31.448 event/dlb2: not in enabled drivers build config
00:01:31.448 event/dpaa: not in enabled drivers build config
00:01:31.448 event/dpaa2: not in enabled drivers build config
00:01:31.448 event/dsw: not in enabled drivers build config
00:01:31.448 event/opdl: not in enabled drivers build config
00:01:31.448 event/skeleton: not in enabled drivers build config
00:01:31.448 event/sw: not in enabled drivers build config
00:01:31.448 event/octeontx: not in enabled drivers build config
00:01:31.448 baseband/acc: not in enabled drivers build config
00:01:31.448 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:31.448 baseband/fpga_lte_fec: not in enabled drivers build config
00:01:31.448 baseband/la12xx: not in enabled drivers build config
00:01:31.448 baseband/null: not in enabled drivers build config
00:01:31.448 baseband/turbo_sw: not in enabled drivers build config
00:01:31.448 gpu/cuda: not in enabled drivers build config
00:01:31.448
00:01:31.448
00:01:31.448 Build targets in project: 217
00:01:31.448
00:01:31.448 DPDK 23.11.0
00:01:31.448
00:01:31.448 User defined options
00:01:31.448 libdir : lib
00:01:31.448 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:31.448 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:31.448 c_link_args :
00:01:31.448 enable_docs : false
00:01:31.448 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:31.448 enable_kmods : false
00:01:31.448 machine : native
00:01:31.448 tests : false
00:01:31.448
00:01:31.448 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:31.448 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
06:44:01 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
[1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
[2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
[3/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
[4/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
[5/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
[6/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
[7/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
[8/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
[9/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
[10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
[11/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
[12/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
[13/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
[14/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
[15/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
[16/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
[17/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
[18/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
[19/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
[20/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
[21/707] Linking static target lib/librte_kvargs.a
[22/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
[23/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
[24/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
[25/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
[26/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
[27/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
[28/707] Compiling C object lib/librte_log.a.p/log_log.c.o
[29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
[30/707] Linking static target lib/librte_pci.a
[31/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
[32/707] Linking static target lib/librte_log.a
[33/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
[34/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
[35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
[36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
[37/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
[38/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
[39/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
[40/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
[41/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
[42/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
[43/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
[44/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
[45/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
[46/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
[47/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
[48/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
[49/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
[50/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
[51/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
[52/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
[53/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
[54/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
[55/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
[56/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
[57/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
[58/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
[59/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
[60/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
[61/707] Linking static target lib/librte_meter.a
[62/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
[63/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
[64/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
[65/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
[66/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
[67/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
[68/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
[69/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
[70/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
[71/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
[72/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
[73/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
[74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
[75/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
[76/707] Linking static target lib/librte_ring.a
[77/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
[78/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
[79/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
[80/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
[81/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
[82/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
[83/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
[84/707] Linking static target lib/librte_cmdline.a
[85/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
[86/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
[87/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
[88/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
[89/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
[90/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
[91/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
[92/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
[93/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
[94/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
[95/707] Linking static target lib/librte_metrics.a
[96/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
[97/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
[98/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
[99/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o
[100/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o
[101/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
[102/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o
[103/707] Linking static target lib/net/libnet_crc_avx512_lib.a
[104/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o
[105/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
[106/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o
[107/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
[108/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
[109/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
[110/707] Linking static target lib/librte_net.a
[111/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
[112/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
[113/707] Linking static target lib/librte_cfgfile.a
[114/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
[115/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o
[116/707] Linking static target lib/librte_bitratestats.a
[117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
[118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
[119/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
[120/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
[121/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
[122/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o
[123/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
[124/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
[125/707] Linking target lib/librte_log.so.24.0
[126/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
[127/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
[128/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
[129/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
[130/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o
[131/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
[132/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
[133/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
[134/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
[135/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o
[136/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
[137/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
[138/707] Linking static target lib/librte_timer.a
[139/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o
[140/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
[141/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
[142/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
[143/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o
[144/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o
[145/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols
[146/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output)
[147/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
[148/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
[149/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
[150/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
[151/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output)
[152/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o
[153/707] Compiling C object
lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:33.072 [154/707] Linking target lib/librte_kvargs.so.24.0 00:01:33.072 [155/707] Linking static target lib/librte_mempool.a 00:01:33.072 [156/707] Linking static target lib/librte_bbdev.a 00:01:33.073 [157/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:33.073 [158/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:33.073 [159/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:33.073 [160/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:33.073 [161/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.073 [162/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:33.073 [163/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:33.073 [164/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:33.073 [165/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:33.073 [166/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:33.073 [167/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:33.073 [168/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:33.073 [169/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:33.073 [170/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:33.073 [171/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:33.073 [172/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:33.073 [173/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.073 [174/707] Linking static target lib/librte_compressdev.a 00:01:33.073 [175/707] Linking static target lib/librte_jobstats.a 00:01:33.073 [176/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:33.073 [177/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:33.073 [178/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:33.073 [179/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:33.073 [180/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:33.073 [181/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:33.073 [182/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:33.073 [183/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:33.073 [184/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:33.073 [185/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:33.073 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:33.335 [187/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:33.335 [188/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:33.335 [189/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:33.335 [190/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:33.335 [191/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:33.335 [192/707] Linking static target lib/librte_dispatcher.a 00:01:33.336 [193/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:33.336 [194/707] Compiling C object 
lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:33.336 [195/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:33.336 [196/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:33.336 [197/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:33.336 [198/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:33.336 [199/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:33.336 [200/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:33.336 [201/707] Linking static target lib/librte_latencystats.a 00:01:33.336 [202/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:33.336 [203/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:33.336 [204/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:33.336 [205/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:33.336 [206/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:33.336 [207/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:33.336 [208/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:33.336 [209/707] Linking static target lib/librte_telemetry.a 00:01:33.336 [210/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:33.336 [211/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:33.336 [212/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:33.336 [213/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:33.336 [214/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.336 [215/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:33.336 [216/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:33.336 [217/707] Linking static target lib/librte_rcu.a 00:01:33.336 [218/707] Linking static target lib/librte_gpudev.a 00:01:33.336 [219/707] Linking static target lib/librte_stack.a 00:01:33.336 [220/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:33.336 [221/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:33.336 [222/707] Linking static target lib/librte_dmadev.a 00:01:33.336 [223/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:33.336 [224/707] Linking static target lib/librte_eal.a 00:01:33.336 [225/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:33.336 [226/707] Linking static target lib/librte_gro.a 00:01:33.336 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:33.336 [228/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:33.336 [229/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:33.336 [230/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:33.336 [231/707] Linking static target lib/librte_regexdev.a 00:01:33.599 [232/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:33.599 [233/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:33.599 [234/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:33.599 [235/707] Linking static target lib/librte_gso.a 00:01:33.599 [236/707] Linking static target 
lib/librte_distributor.a 00:01:33.599 [237/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:33.599 [238/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:33.599 [239/707] Linking static target lib/librte_mldev.a 00:01:33.599 [240/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:33.599 [241/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:33.599 [242/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:33.599 [243/707] Linking static target lib/librte_rawdev.a 00:01:33.599 [244/707] Linking static target lib/librte_power.a 00:01:33.599 [245/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.599 [246/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:33.599 [247/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:33.599 [248/707] Linking static target lib/librte_ip_frag.a 00:01:33.599 [249/707] Linking static target lib/librte_mbuf.a 00:01:33.599 [250/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:33.599 [251/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:33.599 [252/707] Linking static target lib/librte_pcapng.a 00:01:33.599 [253/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:33.599 [254/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:33.599 [255/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:33.599 [256/707] Linking static target lib/librte_reorder.a 00:01:33.599 [257/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:33.599 [258/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:33.599 [259/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.599 [260/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.599 [261/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:33.861 [262/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:33.861 [263/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [264/707] Linking static target lib/librte_bpf.a 00:01:33.861 [265/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:33.861 [266/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:33.861 [267/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [268/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:33.861 [269/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:33.861 [270/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [271/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:33.861 [272/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:33.861 [273/707] Linking static target lib/librte_security.a 00:01:33.861 [274/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [275/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:33.861 [276/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:33.861 [277/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:33.861 [278/707] 
Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [279/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:33.861 [280/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:33.861 [281/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:33.861 [282/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:33.861 [283/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:33.861 [284/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [285/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.861 [286/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:34.119 [287/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:34.119 [288/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.119 [289/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.119 [290/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:34.119 [291/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:34.119 [292/707] Linking static target lib/librte_lpm.a 00:01:34.119 [293/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.119 [294/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:34.119 [295/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:34.119 [296/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.119 [297/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:34.119 [298/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.119 [299/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.119 [300/707] Linking static target lib/librte_rib.a 00:01:34.119 [301/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:34.119 [302/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:34.119 [303/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:34.119 [304/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:34.119 [305/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:34.119 [306/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:34.119 [307/707] Linking target lib/librte_telemetry.so.24.0 00:01:34.119 [308/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:34.119 [309/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:34.119 [310/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:34.119 [311/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:34.119 [312/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:34.119 [313/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:34.386 [314/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:34.386 [315/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:34.386 [316/707] Generating lib/reorder.sym_chk 
with a custom command (wrapped by meson to capture output) 00:01:34.386 [317/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.386 [318/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:34.386 [319/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:34.386 [320/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.386 [321/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:34.386 [322/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:34.386 [323/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:34.386 [324/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:34.386 [325/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:34.386 [326/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:34.386 [327/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:34.386 [328/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:34.386 [329/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:34.386 [330/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:34.386 [331/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:34.386 [332/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:34.386 [333/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:34.386 [334/707] Linking static target lib/librte_efd.a 00:01:34.386 [335/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:34.386 [336/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:34.386 [337/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.386 [338/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:34.386 [339/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:34.651 [340/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:34.651 [341/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:34.651 [342/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:34.651 [343/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:34.651 [344/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:34.651 [345/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:34.651 [346/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:34.651 [347/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.651 [348/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:34.651 [349/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:34.651 [350/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:34.651 [351/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:34.651 [352/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:34.651 [353/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.651 [354/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.651 [355/707] Generating lib/regexdev.sym_chk with a custom 
command (wrapped by meson to capture output) 00:01:34.651 [356/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:34.651 [357/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:34.651 [358/707] Linking static target lib/librte_fib.a 00:01:34.651 [359/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:34.651 [360/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:34.651 [361/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:34.651 [362/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:34.921 [363/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:34.921 [364/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:34.921 [365/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:34.921 [366/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:34.921 [367/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:34.921 [368/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:34.921 [369/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.921 [370/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.921 [371/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:34.921 [372/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.921 [373/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:34.921 [374/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:34.921 [375/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:34.921 [376/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:34.921 [377/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:34.921 [378/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:34.921 [379/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:34.921 [380/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:34.921 [381/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:34.921 [382/707] Linking static target lib/librte_graph.a 00:01:34.921 [383/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:34.921 [384/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:34.921 [385/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:34.921 [386/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:34.921 [387/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:35.185 [388/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:35.185 [389/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:35.185 [390/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:35.185 [391/707] Linking static target lib/librte_pdump.a 00:01:35.185 [392/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:35.185 [393/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:35.186 [394/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:35.186 [395/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:35.186 [396/707] Compiling C object 
app/dpdk-graph.p/graph_mempool.c.o 00:01:35.186 [397/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:35.186 [398/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:01:35.186 [399/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:35.186 [400/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:35.186 [401/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:35.186 [402/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:35.186 [403/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:35.186 [404/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:35.186 [405/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:35.186 [406/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:35.186 [407/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:35.186 [408/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:35.186 [409/707] Linking static target lib/librte_table.a 00:01:35.186 [410/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:35.186 [411/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:35.186 [412/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:35.186 [413/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:35.186 [414/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:35.186 [415/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.186 [416/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:35.186 [417/707] Linking static target drivers/librte_bus_vdev.a 00:01:35.450 [418/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:35.450 [419/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:35.450 [420/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:35.450 [421/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:35.450 [422/707] Linking static target lib/librte_sched.a 00:01:35.450 [423/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:35.450 [424/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:35.450 [425/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:35.450 [426/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:35.450 [427/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:35.450 [428/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:35.450 [429/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:35.450 [430/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:35.450 [431/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:35.450 [432/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:35.450 [433/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.450 [434/707] Linking static target drivers/librte_bus_pci.a 00:01:35.450 [435/707] Compiling C object 
lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:35.450 [436/707] Linking static target lib/librte_cryptodev.a 00:01:35.450 [437/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:35.710 [438/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:35.710 [439/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:35.710 [440/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:35.710 [441/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:35.710 [442/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:35.710 [443/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:35.710 [444/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:35.710 [445/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:35.710 [446/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:35.710 [447/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:35.710 [448/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:35.710 [449/707] Linking static target lib/librte_ipsec.a 00:01:35.710 [450/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.710 [451/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:35.710 [452/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:35.710 [453/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:35.710 [454/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.710 [455/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:35.710 [456/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:35.710 [457/707] Linking static target lib/librte_member.a 00:01:35.710 [458/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:35.710 [459/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:35.710 [460/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:35.710 [461/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:35.710 [462/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:35.710 [463/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:35.710 [464/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:35.710 [465/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:35.710 [466/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:35.710 [467/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:35.710 [468/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:35.710 [469/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:35.969 [470/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:35.969 [471/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:35.969 [472/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:35.969 [473/707] Compiling C object 
app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:35.969 [474/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:35.969 [475/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:35.969 [476/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:35.969 [477/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:35.969 [478/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:35.969 [479/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:35.969 [480/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:35.969 [481/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:35.969 [482/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.969 [483/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:35.969 [484/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.969 [485/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:35.969 [486/707] Linking static target lib/librte_pdcp.a 00:01:35.969 [487/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:35.969 [488/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:35.969 [489/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:35.969 [490/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:35.969 [491/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:35.969 [492/707] Linking static target lib/librte_node.a 00:01:35.969 [493/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:35.969 [494/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:35.969 [495/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:35.969 [496/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:35.969 [497/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:35.969 [498/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:35.969 [499/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:35.969 [500/707] Linking static target drivers/librte_mempool_ring.a 00:01:35.969 [501/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:35.969 [502/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:35.969 [503/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:35.969 [504/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:35.969 [505/707] Linking static target lib/librte_port.a 00:01:36.228 [506/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.228 [507/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:36.228 [508/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.228 [509/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:36.228 [510/707] 
Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.228 [511/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:36.228 [512/707] Linking static target lib/librte_hash.a 00:01:36.228 [513/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:36.228 [514/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:36.228 [515/707] Linking static target lib/acl/libavx2_tmp.a 00:01:36.228 [516/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:36.228 [517/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.228 [518/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:36.228 [519/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:36.228 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:36.228 [521/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:36.228 [522/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:36.228 [523/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:36.228 [524/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:36.228 [525/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:36.228 [526/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:36.228 [527/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:36.228 [528/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.228 [529/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:36.228 [530/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:36.486 [531/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:36.486 [532/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:36.486 [533/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:36.486 [534/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:36.486 [535/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:36.486 [536/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:36.486 [537/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:36.486 [538/707] Linking static target lib/librte_eventdev.a 00:01:36.486 [539/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.486 [540/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:36.486 [541/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:36.486 [542/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:36.486 [543/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:36.486 [544/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:36.486 [545/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:36.486 [546/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:36.486 [547/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:36.486 [548/707] Linking static target lib/librte_acl.a 00:01:36.486 [549/707] Compiling C object 
app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:36.486 [550/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:36.486 [551/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:36.784 [552/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:36.784 [553/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:36.784 [554/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:36.784 [555/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:36.784 [556/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:36.784 [557/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:36.784 [558/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:36.784 [559/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:36.784 [560/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:36.784 [561/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:36.784 [562/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:36.784 [563/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:36.784 [564/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.784 [565/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:36.784 [566/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.784 [567/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:36.784 [568/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.042 [569/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:37.042 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:37.300 [571/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:37.300 [572/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:37.300 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:37.300 [574/707] Linking static target lib/librte_ethdev.a 00:01:37.300 [575/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.559 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:37.817 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:37.817 [578/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:38.076 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:38.334 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:38.593 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:38.593 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:38.593 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:38.852 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:38.852 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:38.852 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:38.852 [587/707] Linking 
static target drivers/librte_net_i40e.a 00:01:39.420 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:39.990 [589/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.990 [590/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:39.990 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.558 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:45.832 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.833 [594/707] Linking target lib/librte_eal.so.24.0 00:01:45.833 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:45.833 [596/707] Linking target lib/librte_ring.so.24.0 00:01:45.833 [597/707] Linking target lib/librte_pci.so.24.0 00:01:45.833 [598/707] Linking target lib/librte_meter.so.24.0 00:01:45.833 [599/707] Linking target drivers/librte_bus_vdev.so.24.0 00:01:45.833 [600/707] Linking target lib/librte_cfgfile.so.24.0 00:01:45.833 [601/707] Linking target lib/librte_timer.so.24.0 00:01:45.833 [602/707] Linking target lib/librte_dmadev.so.24.0 00:01:45.833 [603/707] Linking target lib/librte_jobstats.so.24.0 00:01:45.833 [604/707] Linking target lib/librte_rawdev.so.24.0 00:01:45.833 [605/707] Linking target lib/librte_stack.so.24.0 00:01:45.833 [606/707] Linking target lib/librte_acl.so.24.0 00:01:45.833 [607/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.833 [608/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:45.833 [609/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:01:45.833 [610/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:45.833 [611/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:45.833 [612/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:01:45.833 [613/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:45.833 [614/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:45.833 [615/707] Linking target lib/librte_mempool.so.24.0 00:01:45.833 [616/707] Linking target lib/librte_rcu.so.24.0 00:01:45.833 [617/707] Linking target drivers/librte_bus_pci.so.24.0 00:01:45.833 [618/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:45.833 [619/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:45.833 [620/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:01:45.833 [621/707] Linking target lib/librte_mbuf.so.24.0 00:01:45.833 [622/707] Linking target drivers/librte_mempool_ring.so.24.0 00:01:45.833 [623/707] Linking target lib/librte_rib.so.24.0 00:01:46.092 [624/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:46.092 [625/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:01:46.092 [626/707] Linking target lib/librte_cryptodev.so.24.0 00:01:46.092 [627/707] Linking target lib/librte_mldev.so.24.0 00:01:46.092 [628/707] Linking target lib/librte_distributor.so.24.0 00:01:46.092 [629/707] Linking target lib/librte_regexdev.so.24.0 00:01:46.092 [630/707] Linking 
target lib/librte_bbdev.so.24.0 00:01:46.092 [631/707] Linking target lib/librte_gpudev.so.24.0 00:01:46.092 [632/707] Linking target lib/librte_reorder.so.24.0 00:01:46.092 [633/707] Linking target lib/librte_net.so.24.0 00:01:46.092 [634/707] Linking target lib/librte_compressdev.so.24.0 00:01:46.092 [635/707] Linking target lib/librte_sched.so.24.0 00:01:46.092 [636/707] Linking target lib/librte_fib.so.24.0 00:01:46.092 [637/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:46.092 [638/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:01:46.092 [639/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:46.092 [640/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:01:46.092 [641/707] Linking target lib/librte_hash.so.24.0 00:01:46.092 [642/707] Linking target lib/librte_cmdline.so.24.0 00:01:46.092 [643/707] Linking target lib/librte_security.so.24.0 00:01:46.351 [644/707] Linking target lib/librte_ethdev.so.24.0 00:01:46.351 [645/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:46.351 [646/707] Linking static target lib/librte_pipeline.a 00:01:46.351 [647/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:46.351 [648/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:01:46.351 [649/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:46.351 [650/707] Linking target lib/librte_efd.so.24.0 00:01:46.351 [651/707] Linking target lib/librte_lpm.so.24.0 00:01:46.351 [652/707] Linking target lib/librte_pdcp.so.24.0 00:01:46.351 [653/707] Linking target lib/librte_member.so.24.0 00:01:46.351 [654/707] Linking target lib/librte_ipsec.so.24.0 00:01:46.351 [655/707] Linking target lib/librte_metrics.so.24.0 00:01:46.351 [656/707] Linking target lib/librte_gso.so.24.0 00:01:46.351 [657/707] Linking target lib/librte_pcapng.so.24.0 00:01:46.351 [658/707] Linking target lib/librte_eventdev.so.24.0 00:01:46.351 [659/707] Linking target lib/librte_bpf.so.24.0 00:01:46.351 [660/707] Linking target lib/librte_ip_frag.so.24.0 00:01:46.351 [661/707] Linking target lib/librte_gro.so.24.0 00:01:46.351 [662/707] Linking target lib/librte_power.so.24.0 00:01:46.351 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:01:46.611 [664/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:01:46.611 [665/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:01:46.611 [666/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:01:46.611 [667/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:01:46.611 [668/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:01:46.611 [669/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:01:46.611 [670/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:01:46.611 [671/707] Linking target lib/librte_bitratestats.so.24.0 00:01:46.611 [672/707] Linking target lib/librte_latencystats.so.24.0 00:01:46.611 [673/707] Linking target lib/librte_graph.so.24.0 00:01:46.611 [674/707] Linking target lib/librte_dispatcher.so.24.0 00:01:46.611 [675/707] Linking target lib/librte_pdump.so.24.0 00:01:46.611 [676/707] Linking target 
lib/librte_port.so.24.0 00:01:46.870 [677/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:01:46.870 [678/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:46.870 [679/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:01:46.870 [680/707] Linking static target lib/librte_vhost.a 00:01:46.870 [681/707] Linking target lib/librte_node.so.24.0 00:01:46.870 [682/707] Linking target lib/librte_table.so.24.0 00:01:46.870 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:01:47.129 [684/707] Linking target app/dpdk-proc-info 00:01:47.129 [685/707] Linking target app/dpdk-graph 00:01:47.129 [686/707] Linking target app/dpdk-pdump 00:01:47.129 [687/707] Linking target app/dpdk-test-acl 00:01:47.129 [688/707] Linking target app/dpdk-dumpcap 00:01:47.388 [689/707] Linking target app/dpdk-test-regex 00:01:47.388 [690/707] Linking target app/dpdk-test-gpudev 00:01:47.388 [691/707] Linking target app/dpdk-test-cmdline 00:01:47.388 [692/707] Linking target app/dpdk-test-dma-perf 00:01:47.388 [693/707] Linking target app/dpdk-test-fib 00:01:47.388 [694/707] Linking target app/dpdk-test-flow-perf 00:01:47.388 [695/707] Linking target app/dpdk-test-sad 00:01:47.388 [696/707] Linking target app/dpdk-test-bbdev 00:01:47.388 [697/707] Linking target app/dpdk-test-security-perf 00:01:47.388 [698/707] Linking target app/dpdk-test-mldev 00:01:47.388 [699/707] Linking target app/dpdk-test-crypto-perf 00:01:47.388 [700/707] Linking target app/dpdk-test-compress-perf 00:01:47.388 [701/707] Linking target app/dpdk-test-pipeline 00:01:47.388 [702/707] Linking target app/dpdk-test-eventdev 00:01:47.388 [703/707] Linking target app/dpdk-testpmd 00:01:49.296 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.296 [705/707] Linking target lib/librte_vhost.so.24.0 00:01:51.834 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.834 [707/707] Linking target lib/librte_pipeline.so.24.0 00:01:51.834 06:44:21 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:01:51.834 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:51.834 [0/1] Installing files. 
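[editor's note] The 707-target build above and the `ninja ... install` step it hands off to here were driven by the meson configuration summarized at the top of this section. As a minimal sketch (the autobuild_common.sh wrapper itself is not shown in this excerpt, and the one-to-one mapping of the summary fields to `-D` flags is assumed), the equivalent manual invocation would be:

# Reconstructed from the configuration summary and ninja invocations logged above.
cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk

# Using "meson setup" (rather than the bare "meson [options]" form) avoids the
# deprecation WARNING recorded earlier in this log; the options below mirror
# the c_args/enable_docs/enable_drivers/enable_kmods/machine/tests summary:
meson setup build-tmp \
    -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Denable_docs=false \
    -Denable_kmods=false \
    -Dtests=false \
    -Dmachine=native \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base

# Compile the 707 targets with 112 parallel jobs, then install, as in the log:
ninja -C build-tmp -j112
ninja -C build-tmp -j112 install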
00:01:52.098 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.098 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.099 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.100 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.101 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.102 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.103 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.104 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to 
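The example sources staged above under build/share/dpdk/examples are standalone projects meant to be compiled against this installed tree, typically by resolving DPDK through pkg-config's libdpdk file. As a rough sketch of the common entry-point pattern they share (the file name and compile command in the comment are illustrative assumptions, not taken from this build):

    /* minimal_eal.c -- hypothetical name; the pattern used by the installed
     * examples such as skeleton/basicfwd.c. Assumed compile command:
     *   cc minimal_eal.c $(pkg-config --cflags --libs libdpdk) */
    #include <stdio.h>
    #include <stdlib.h>
    #include <rte_eal.h>
    #include <rte_ethdev.h>

    int main(int argc, char **argv)
    {
        /* rte_eal_init() consumes the EAL arguments (cores, memory, devices) */
        if (rte_eal_init(argc, argv) < 0)
            rte_exit(EXIT_FAILURE, "EAL init failed\n");

        /* report how many ethdev ports the EAL probed */
        printf("%u ports available\n", rte_eth_dev_count_avail());

        rte_eal_cleanup();
        return 0;
    }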
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:52.105 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:52.105 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.105 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.106 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
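Each library above is staged twice, as a static archive (.a) and as a versioned shared object (.so.24.0, DPDK's 24.0 ABI). A quick way to confirm which installed tree an application actually linked against is the version string exported by librte_eal; a minimal sketch (file name is illustrative):

    /* version_check.c -- hypothetical name; prints the DPDK version the
     * program was built against, using rte_version() from librte_eal */
    #include <stdio.h>
    #include <rte_version.h>

    int main(void)
    {
        /* rte_version() returns a string like "DPDK 23.11.0" */
        printf("built against %s\n", rte_version());
        return 0;
    }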
00:01:52.367 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:52.367 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:52.367 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:52.367 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.367 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:52.367 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.367 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.368 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.368 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.368 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.368 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.368 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
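The PMD shared objects above land in the dedicated pmds-24.0 driver directory, which shared-library builds scan at EAL startup; a driver can also be loaded explicitly with the standard EAL -d option. A minimal sketch of the latter (file name is illustrative, and the driver path is an assumption matching this build's staging directory):

    /* load_pmd.c -- hypothetical name; passes a synthetic EAL argv that loads
     * the i40e PMD installed above via the EAL's -d option */
    #include <stdlib.h>
    #include <rte_eal.h>

    int main(void)
    {
        char *eal_argv[] = {
            (char *)"load_pmd",
            (char *)"-d",
            (char *)"/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24.0",
            NULL,
        };
        /* argc counts the three real arguments; NULL terminates the vector */
        if (rte_eal_init(3, eal_argv) < 0)
            return EXIT_FAILURE;
        rte_eal_cleanup();
        return 0;
    }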
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
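Note the layout being staged here: portable fallbacks go under include/generic while the x86 variants of the same headers (rte_atomic.h, rte_cycles.h, rte_pause.h, and so on) go directly under include, so a plain #include resolves to the architecture-specific implementation. A small sketch exercising two of them (file name is an illustrative assumption):

    /* tsc_spin.c -- hypothetical name; <rte_cycles.h> and <rte_pause.h>
     * resolve to the x86 versions installed above */
    #include <stdio.h>
    #include <inttypes.h>
    #include <rte_cycles.h>
    #include <rte_pause.h>

    int main(void)
    {
        uint64_t start = rte_rdtsc();   /* read the x86 time-stamp counter */
        for (int i = 0; i < 1000; i++)
            rte_pause();                /* emits the PAUSE spin-wait hint */
        printf("spun for %" PRIu64 " cycles\n", rte_rdtsc() - start);
        return 0;
    }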
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.368 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
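Among the EAL headers staged above are rte_launch.h and rte_lcore.h, the multi-core launch API. A minimal sketch of how they are used together (file name is an illustrative assumption):

    /* lcore_hello.c -- hypothetical name; runs a function on every worker
     * lcore using the launch API from the headers installed above */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_launch.h>
    #include <rte_lcore.h>

    static int hello(void *arg)
    {
        (void)arg;
        printf("hello from lcore %u\n", rte_lcore_id());
        return 0;
    }

    int main(int argc, char **argv)
    {
        if (rte_eal_init(argc, argv) < 0)
            return 1;
        /* launch hello() on all worker lcores, then wait for completion */
        rte_eal_mp_remote_launch(hello, NULL, SKIP_MAIN);
        rte_eal_mp_wait_lcore();
        rte_eal_cleanup();
        return 0;
    }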
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
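Of the ring headers staged above, rte_ring.h is the public API; the *_pvt.h files are internal implementation detail pulled in transitively. A minimal sketch of the public interface (file name is an illustrative assumption):

    /* ring_demo.c -- hypothetical name; exercises the rte_ring.h API */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_lcore.h>
    #include <rte_ring.h>

    int main(int argc, char **argv)
    {
        if (rte_eal_init(argc, argv) < 0)
            return 1;
        /* single-producer/single-consumer ring of 1024 pointer slots */
        struct rte_ring *r = rte_ring_create("demo", 1024, rte_socket_id(),
                                             RING_F_SP_ENQ | RING_F_SC_DEQ);
        int x = 42;
        void *out = NULL;
        if (r != NULL && rte_ring_enqueue(r, &x) == 0 &&
            rte_ring_dequeue(r, &out) == 0)
            printf("dequeued %d\n", *(int *)out);
        rte_ring_free(r);
        rte_eal_cleanup();
        return 0;
    }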
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.369 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
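The lib/net headers staged above define on-the-wire protocol structures. A minimal sketch of how they are typically used on a received frame (file and function names are illustrative assumptions):

    /* hdr_peek.c -- hypothetical name; parses an Ethernet header in place
     * using <rte_ether.h> and <rte_byteorder.h> from the installed tree */
    #include <stdio.h>
    #include <stdint.h>
    #include <rte_ether.h>
    #include <rte_byteorder.h>

    static void peek(const uint8_t *frame)
    {
        /* overlay the packed Ethernet header on the frame bytes */
        const struct rte_ether_hdr *eh = (const struct rte_ether_hdr *)frame;
        /* EtherType is big-endian on the wire */
        if (rte_be_to_cpu_16(eh->ether_type) == RTE_ETHER_TYPE_IPV4)
            printf("IPv4 frame\n");
    }

    int main(void)
    {
        uint8_t frame[RTE_ETHER_HDR_LEN] = { 0 };
        frame[12] = 0x08;   /* EtherType 0x0800 = IPv4 */
        frame[13] = 0x00;
        peek(frame);
        return 0;
    }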
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
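The hash headers staged above include header-only helpers such as rte_jhash.h that need no EAL setup at all. A minimal sketch (file name is an illustrative assumption):

    /* jhash_demo.c -- hypothetical name; uses the Jenkins-hash helper from
     * <rte_jhash.h>, which works without initializing the EAL */
    #include <stdio.h>
    #include <stdint.h>
    #include <rte_jhash.h>

    int main(void)
    {
        uint32_t key[2] = { 0xdead, 0xbeef };
        /* rte_jhash(key pointer, length in bytes, initial seed) */
        uint32_t h = rte_jhash(key, sizeof(key), 0);
        printf("hash = 0x%08x\n", h);
        return 0;
    }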
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.632 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.633 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:52.634 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:52.634 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:01:52.634 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:01:52.634 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:01:52.634 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:52.634 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:01:52.634 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:52.634 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:01:52.634 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:52.634 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:01:52.634 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:52.634 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:01:52.634 Installing symlink pointing to 
librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:52.635 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:01:52.635 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:52.635 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:01:52.635 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:52.635 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:01:52.635 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:52.635 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:01:52.635 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:52.635 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:01:52.635 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:52.635 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:01:52.635 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:52.635 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:01:52.635 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:52.635 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:01:52.635 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:52.635 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:01:52.635 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:52.635 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:01:52.635 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:52.635 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:01:52.635 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:52.635 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:01:52.635 Installing symlink pointing to librte_bbdev.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:52.635 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:01:52.635 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:52.635 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:01:52.635 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:52.635 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:01:52.635 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:52.635 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:01:52.635 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:52.635 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:01:52.635 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:52.635 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:01:52.635 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:52.635 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:01:52.635 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:52.635 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:01:52.635 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:52.635 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:01:52.635 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:52.635 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:01:52.635 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:01:52.635 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:01:52.635 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:52.635 Installing symlink pointing to librte_gro.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:01:52.635 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:52.635 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:01:52.635 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:52.635 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:01:52.635 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:52.635 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:01:52.635 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:52.635 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:01:52.635 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:52.635 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:01:52.635 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:52.635 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:01:52.635 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:52.635 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:01:52.635 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:52.635 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:01:52.635 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:52.635 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:01:52.635 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:52.635 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:01:52.635 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:52.635 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:01:52.635 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:01:52.635 Installing symlink pointing to 
librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:01:52.635 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:52.635 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:01:52.635 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:52.635 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:01:52.635 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:52.635 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:01:52.635 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:52.635 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:01:52.635 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:52.635 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:01:52.635 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:52.635 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:01:52.635 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:52.635 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:01:52.635 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:01:52.635 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:01:52.635 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:52.635 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:01:52.635 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:52.635 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:01:52.636 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:52.636 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:01:52.636 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:52.636 Installing symlink pointing to librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:01:52.636 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so
00:01:52.636 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24
00:01:52.636 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so
00:01:52.636 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24
00:01:52.636 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so'
00:01:52.636 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24'
00:01:52.636 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0'
00:01:52.636 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so'
00:01:52.636 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24'
00:01:52.636 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0'
00:01:52.636 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so'
00:01:52.636 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24'
00:01:52.636 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0'
00:01:52.636 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so'
00:01:52.636 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24'
00:01:52.636 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0'
00:01:52.636 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so
00:01:52.636 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24
00:01:52.636 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so
00:01:52.636 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24
00:01:52.636 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so
00:01:52.636 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24
00:01:52.636 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so
00:01:52.636 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24
00:01:52.636 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so
00:01:52.636 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0'
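Note: the symlink installs above follow the usual SONAME chain (librte_X.so.24.0 is the real file, .so.24 is the runtime link, bare .so is the link-time link), and symlink-drivers-solibs.sh re-points the driver libraries into the dpdk/pmds-24.0 plugin directory. A minimal shell sketch of that layout, with illustrative names only, not the script's exact commands:

    # Hedged sketch: the SONAME link chain for one library, as installed above.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    ln -sf librte_log.so.24.0 librte_log.so.24   # runtime (SONAME) link
    ln -sf librte_log.so.24 librte_log.so        # development (link-time) link
    # Drivers additionally get grouped under the plugin directory so they can
    # be loaded as a set at runtime (names illustrative):
    mkdir -p dpdk/pmds-24.0
    ln -sf ../../librte_bus_pci.so.24.0 dpdk/pmds-24.0/librte_bus_pci.so.24.0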
00:01:52.636 06:44:22 -- common/autobuild_common.sh@189 -- $ uname -s
00:01:52.636 06:44:22 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:01:52.636 06:44:22 -- common/autobuild_common.sh@200 -- $ cat
00:01:52.636 06:44:22 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:52.636
00:01:52.636 real 0m27.203s
00:01:52.636 user 7m59.234s
00:01:52.636 sys 2m32.655s
00:01:52.636 06:44:22 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:52.636 06:44:22 -- common/autotest_common.sh@10 -- $ set +x
00:01:52.636 ************************************
00:01:52.636 END TEST build_native_dpdk
00:01:52.636 ************************************
00:01:52.636 06:44:22 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:52.636 06:44:22 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:52.636 06:44:22 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:01:52.636 06:44:22 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:01:52.636 06:44:22 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:01:52.636 06:44:22 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:01:52.636 06:44:22 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:52.636 06:44:22 -- common/autotest_common.sh@10 -- $ set +x
00:01:52.636 ************************************
00:01:52.636 START TEST autobuild_llvm_precompile
00:01:52.636 ************************************
00:01:52.636 06:44:22 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:01:52.636 06:44:22 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:52.636 06:44:22 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:01:52.636 Target: x86_64-redhat-linux-gnu
00:01:52.636 Thread model: posix
00:01:52.636 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:01:52.636 06:44:22 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:01:52.636 06:44:22 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:01:52.636 06:44:22 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:01:52.636 06:44:22 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:01:52.636 06:44:22 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:01:52.636 06:44:22 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:01:52.636 06:44:22 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:52.636 06:44:22 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:01:52.636 06:44:22 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:01:52.636 06:44:22 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:52.896 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
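Note: --with-dpdk points configure at the tree installed above, and the libdpdk.pc written into build/lib/pkgconfig is what it consumes. A quick sketch of querying that file by hand, assuming the same workspace paths:

    # Hedged sketch: inspect the DPDK pkg-config data the configure step uses.
    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    pkg-config --modversion libdpdk      # 23.11.0 for this checkout
    pkg-config --cflags --libs libdpdk   # compile/link flags a consumer would pick up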
00:01:53.157 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.158 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.158 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:53.724 Using 'verbs' RDMA provider
00:02:09.183 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:21.413 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:21.673 Creating mk/config.mk...done.
00:02:21.673 Creating mk/cc.flags.mk...done.
00:02:21.673 Type 'make' to build.
00:02:21.673
00:02:21.673 real 0m29.057s
00:02:21.673 user 0m12.333s
00:02:21.673 sys 0m15.875s
00:02:21.673 06:44:51 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:21.673 06:44:51 -- common/autotest_common.sh@10 -- $ set +x
00:02:21.673 ************************************
00:02:21.673 END TEST autobuild_llvm_precompile
00:02:21.673 ************************************
00:02:21.673 06:44:51 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:21.673 06:44:51 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:21.673 06:44:51 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:21.673 06:44:51 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:21.673 06:44:51 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:21.931 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:22.190 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:22.190 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:22.190 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:22.449 Using 'verbs' RDMA provider
00:02:35.702 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:47.917 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:47.917 Creating mk/config.mk...done.
00:02:47.917 Creating mk/cc.flags.mk...done.
00:02:47.917 Type 'make' to build.
00:02:47.917 06:45:16 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:47.917 06:45:16 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:02:47.917 06:45:16 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:02:47.917 06:45:16 -- common/autotest_common.sh@10 -- $ set +x
00:02:47.917 ************************************
00:02:47.917 START TEST make
00:02:47.917 ************************************
00:02:47.917 06:45:16 -- common/autotest_common.sh@1104 -- $ make -j112
00:02:47.917 make[1]: Nothing to be done for 'all'.
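Note: the make stage begins by configuring the bundled libvfio-user subproject with Meson; its output follows. A sketch of an equivalent manual invocation, inferred from the 'User defined options' reported below rather than from SPDK's actual makefile:

    # Hedged sketch: configure and build libvfio-user the way the log suggests.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
    meson setup ../build/libvfio-user/build-debug \
        -Dbuildtype=debug -Ddefault_library=static -Dlibdir=/usr/local/lib
    ninja -C ../build/libvfio-user/build-debug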
00:02:49.297 The Meson build system
00:02:49.297 Version: 1.3.1
00:02:49.297 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:49.297 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:49.297 Build type: native build
00:02:49.297 Project name: libvfio-user
00:02:49.297 Project version: 0.0.1
00:02:49.297 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:49.297 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:49.297 Host machine cpu family: x86_64
00:02:49.297 Host machine cpu: x86_64
00:02:49.297 Run-time dependency threads found: YES
00:02:49.297 Library dl found: YES
00:02:49.297 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:49.297 Run-time dependency json-c found: YES 0.17
00:02:49.297 Run-time dependency cmocka found: YES 1.1.7
00:02:49.297 Program pytest-3 found: NO
00:02:49.297 Program flake8 found: NO
00:02:49.297 Program misspell-fixer found: NO
00:02:49.297 Program restructuredtext-lint found: NO
00:02:49.297 Program valgrind found: YES (/usr/bin/valgrind)
00:02:49.297 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:49.297 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:49.297 Compiler for C supports arguments -Wwrite-strings: YES
00:02:49.297 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:49.297 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:49.297 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:49.297 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:49.297 Build targets in project: 8
00:02:49.297 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:49.297 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:49.297
00:02:49.297 libvfio-user 0.0.1
00:02:49.297
00:02:49.297 User defined options
00:02:49.297 buildtype : debug
00:02:49.297 default_library: static
00:02:49.297 libdir : /usr/local/lib
00:02:49.297
00:02:49.297 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:49.297 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:49.557 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:49.557 [2/36] Compiling C object samples/null.p/null.c.o
00:02:49.557 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:49.557 [4/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:49.557 [5/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:49.557 [6/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:49.557 [7/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:49.557 [8/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:49.557 [9/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:49.557 [10/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:49.557 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:49.557 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:49.557 [13/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:49.557 [14/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:49.557 [15/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:49.557 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:49.557 [17/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:49.557 [18/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:49.557 [19/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:49.557 [20/36] Compiling C object samples/server.p/server.c.o
00:02:49.557 [21/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:49.557 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:49.557 [23/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:49.557 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:49.557 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:49.557 [26/36] Compiling C object samples/client.p/client.c.o
00:02:49.557 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:49.557 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:49.557 [29/36] Linking static target lib/libvfio-user.a
00:02:49.557 [30/36] Linking target samples/client
00:02:49.557 [31/36] Linking target test/unit_tests
00:02:49.557 [32/36] Linking target samples/gpio-pci-idio-16
00:02:49.557 [33/36] Linking target samples/server
00:02:49.557 [34/36] Linking target samples/null
00:02:49.557 [35/36] Linking target samples/lspci
00:02:49.557 [36/36] Linking target samples/shadow_ioeventfd_server
00:02:49.557 INFO: autodetecting backend as ninja
00:02:49.557 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:49.817 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:50.076 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:50.076 ninja: no work to do. 00:02:53.369 CC lib/ut_mock/mock.o 00:02:53.369 CC lib/log/log.o 00:02:53.369 CC lib/log/log_flags.o 00:02:53.369 CC lib/log/log_deprecated.o 00:02:53.369 CC lib/ut/ut.o 00:02:53.369 LIB libspdk_ut_mock.a 00:02:53.369 LIB libspdk_log.a 00:02:53.369 LIB libspdk_ut.a 00:02:53.369 CXX lib/trace_parser/trace.o 00:02:53.627 CC lib/dma/dma.o 00:02:53.627 CC lib/util/base64.o 00:02:53.627 CC lib/util/bit_array.o 00:02:53.627 CC lib/util/crc16.o 00:02:53.627 CC lib/util/cpuset.o 00:02:53.627 CC lib/util/crc32.o 00:02:53.627 CC lib/util/crc32c.o 00:02:53.627 CC lib/util/crc64.o 00:02:53.627 CC lib/util/crc32_ieee.o 00:02:53.627 CC lib/util/dif.o 00:02:53.627 CC lib/util/fd.o 00:02:53.627 CC lib/util/file.o 00:02:53.627 CC lib/util/hexlify.o 00:02:53.627 CC lib/ioat/ioat.o 00:02:53.627 CC lib/util/iov.o 00:02:53.627 CC lib/util/math.o 00:02:53.627 CC lib/util/pipe.o 00:02:53.627 CC lib/util/strerror_tls.o 00:02:53.627 CC lib/util/string.o 00:02:53.627 CC lib/util/uuid.o 00:02:53.627 CC lib/util/fd_group.o 00:02:53.627 CC lib/util/xor.o 00:02:53.627 CC lib/util/zipf.o 00:02:53.627 LIB libspdk_dma.a 00:02:53.627 CC lib/vfio_user/host/vfio_user_pci.o 00:02:53.627 CC lib/vfio_user/host/vfio_user.o 00:02:53.627 LIB libspdk_ioat.a 00:02:53.886 LIB libspdk_vfio_user.a 00:02:53.886 LIB libspdk_util.a 00:02:53.886 LIB libspdk_trace_parser.a 00:02:54.143 CC lib/vmd/led.o 00:02:54.143 CC lib/vmd/vmd.o 00:02:54.143 CC lib/rdma/common.o 00:02:54.143 CC lib/rdma/rdma_verbs.o 00:02:54.143 CC lib/json/json_parse.o 00:02:54.143 CC lib/json/json_util.o 00:02:54.143 CC lib/json/json_write.o 00:02:54.143 CC lib/conf/conf.o 00:02:54.143 CC lib/env_dpdk/memory.o 00:02:54.144 CC lib/idxd/idxd.o 00:02:54.144 CC lib/idxd/idxd_user.o 00:02:54.144 CC lib/env_dpdk/env.o 00:02:54.144 CC lib/env_dpdk/pci.o 00:02:54.144 CC lib/env_dpdk/init.o 00:02:54.144 CC lib/env_dpdk/threads.o 00:02:54.144 CC lib/env_dpdk/pci_ioat.o 00:02:54.144 CC lib/env_dpdk/pci_virtio.o 00:02:54.144 CC lib/env_dpdk/pci_vmd.o 00:02:54.144 CC lib/env_dpdk/pci_idxd.o 00:02:54.144 CC lib/env_dpdk/pci_event.o 00:02:54.144 CC lib/env_dpdk/sigbus_handler.o 00:02:54.144 CC lib/env_dpdk/pci_dpdk.o 00:02:54.144 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:54.144 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:54.402 LIB libspdk_conf.a 00:02:54.402 LIB libspdk_rdma.a 00:02:54.402 LIB libspdk_json.a 00:02:54.402 LIB libspdk_vmd.a 00:02:54.402 LIB libspdk_idxd.a 00:02:54.659 CC lib/jsonrpc/jsonrpc_server.o 00:02:54.659 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:54.659 CC lib/jsonrpc/jsonrpc_client.o 00:02:54.659 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:54.918 LIB libspdk_jsonrpc.a 00:02:55.178 LIB libspdk_env_dpdk.a 00:02:55.178 CC lib/rpc/rpc.o 00:02:55.178 LIB libspdk_rpc.a 00:02:55.436 CC lib/notify/notify.o 00:02:55.436 CC lib/notify/notify_rpc.o 00:02:55.436 CC lib/trace/trace.o 00:02:55.436 CC lib/trace/trace_flags.o 00:02:55.436 CC lib/trace/trace_rpc.o 00:02:55.436 CC lib/sock/sock.o 00:02:55.436 CC lib/sock/sock_rpc.o 00:02:55.696 LIB libspdk_notify.a 00:02:55.696 LIB libspdk_trace.a 00:02:55.696 LIB libspdk_sock.a 00:02:55.955 CC lib/thread/thread.o 00:02:55.955 CC lib/thread/iobuf.o 00:02:56.214 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:56.214 CC lib/nvme/nvme_fabric.o 00:02:56.214 CC lib/nvme/nvme_ctrlr.o 00:02:56.214 CC 
lib/nvme/nvme_ns_cmd.o 00:02:56.214 CC lib/nvme/nvme_ns.o 00:02:56.214 CC lib/nvme/nvme_pcie_common.o 00:02:56.214 CC lib/nvme/nvme_pcie.o 00:02:56.214 CC lib/nvme/nvme_quirks.o 00:02:56.214 CC lib/nvme/nvme_qpair.o 00:02:56.214 CC lib/nvme/nvme.o 00:02:56.214 CC lib/nvme/nvme_transport.o 00:02:56.214 CC lib/nvme/nvme_discovery.o 00:02:56.214 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:56.214 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:56.214 CC lib/nvme/nvme_tcp.o 00:02:56.214 CC lib/nvme/nvme_opal.o 00:02:56.214 CC lib/nvme/nvme_io_msg.o 00:02:56.214 CC lib/nvme/nvme_poll_group.o 00:02:56.214 CC lib/nvme/nvme_zns.o 00:02:56.214 CC lib/nvme/nvme_cuse.o 00:02:56.214 CC lib/nvme/nvme_vfio_user.o 00:02:56.214 CC lib/nvme/nvme_rdma.o 00:02:56.782 LIB libspdk_thread.a 00:02:57.041 CC lib/virtio/virtio.o 00:02:57.041 CC lib/blob/blobstore.o 00:02:57.041 CC lib/virtio/virtio_vhost_user.o 00:02:57.041 CC lib/virtio/virtio_pci.o 00:02:57.041 CC lib/virtio/virtio_vfio_user.o 00:02:57.041 CC lib/blob/request.o 00:02:57.041 CC lib/blob/zeroes.o 00:02:57.041 CC lib/blob/blob_bs_dev.o 00:02:57.041 CC lib/init/json_config.o 00:02:57.041 CC lib/accel/accel.o 00:02:57.041 CC lib/vfu_tgt/tgt_endpoint.o 00:02:57.041 CC lib/accel/accel_rpc.o 00:02:57.041 CC lib/accel/accel_sw.o 00:02:57.041 CC lib/vfu_tgt/tgt_rpc.o 00:02:57.041 CC lib/init/subsystem.o 00:02:57.041 CC lib/init/rpc.o 00:02:57.041 CC lib/init/subsystem_rpc.o 00:02:57.300 LIB libspdk_init.a 00:02:57.300 LIB libspdk_virtio.a 00:02:57.300 LIB libspdk_vfu_tgt.a 00:02:57.300 LIB libspdk_nvme.a 00:02:57.560 CC lib/event/app.o 00:02:57.560 CC lib/event/reactor.o 00:02:57.560 CC lib/event/log_rpc.o 00:02:57.560 CC lib/event/app_rpc.o 00:02:57.560 CC lib/event/scheduler_static.o 00:02:57.820 LIB libspdk_event.a 00:02:57.820 LIB libspdk_accel.a 00:02:58.079 CC lib/bdev/bdev.o 00:02:58.079 CC lib/bdev/bdev_rpc.o 00:02:58.079 CC lib/bdev/bdev_zone.o 00:02:58.079 CC lib/bdev/part.o 00:02:58.079 CC lib/bdev/scsi_nvme.o 00:02:58.646 LIB libspdk_blob.a 00:02:58.905 CC lib/blobfs/blobfs.o 00:02:58.905 CC lib/blobfs/tree.o 00:02:58.905 CC lib/lvol/lvol.o 00:02:59.474 LIB libspdk_lvol.a 00:02:59.474 LIB libspdk_blobfs.a 00:02:59.733 LIB libspdk_bdev.a 00:02:59.992 CC lib/scsi/dev.o 00:02:59.992 CC lib/scsi/lun.o 00:02:59.992 CC lib/scsi/port.o 00:02:59.992 CC lib/scsi/scsi.o 00:02:59.992 CC lib/scsi/scsi_bdev.o 00:02:59.992 CC lib/scsi/scsi_pr.o 00:02:59.992 CC lib/scsi/scsi_rpc.o 00:02:59.992 CC lib/scsi/task.o 00:02:59.992 CC lib/ftl/ftl_init.o 00:02:59.992 CC lib/nvmf/ctrlr.o 00:02:59.992 CC lib/ftl/ftl_core.o 00:02:59.992 CC lib/nvmf/ctrlr_discovery.o 00:02:59.992 CC lib/ftl/ftl_io.o 00:02:59.992 CC lib/ftl/ftl_layout.o 00:02:59.992 CC lib/nvmf/subsystem.o 00:02:59.992 CC lib/nvmf/ctrlr_bdev.o 00:02:59.992 CC lib/ftl/ftl_debug.o 00:02:59.992 CC lib/ftl/ftl_l2p.o 00:02:59.992 CC lib/ftl/ftl_sb.o 00:02:59.992 CC lib/nvmf/nvmf.o 00:02:59.992 CC lib/nvmf/transport.o 00:02:59.992 CC lib/nvmf/nvmf_rpc.o 00:02:59.992 CC lib/ftl/ftl_l2p_flat.o 00:02:59.992 CC lib/ftl/ftl_nv_cache.o 00:02:59.992 CC lib/nvmf/tcp.o 00:02:59.992 CC lib/ftl/ftl_band.o 00:02:59.992 CC lib/nvmf/vfio_user.o 00:02:59.992 CC lib/ftl/ftl_band_ops.o 00:02:59.992 CC lib/nvmf/rdma.o 00:02:59.992 CC lib/ftl/ftl_rq.o 00:02:59.992 CC lib/ftl/ftl_writer.o 00:02:59.992 CC lib/ftl/ftl_reloc.o 00:02:59.992 CC lib/ftl/ftl_l2p_cache.o 00:02:59.992 CC lib/ftl/ftl_p2l.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:59.992 CC 
lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:59.992 CC lib/ublk/ublk.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:59.992 CC lib/ublk/ublk_rpc.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:59.992 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:59.992 CC lib/ftl/utils/ftl_conf.o 00:02:59.992 CC lib/nbd/nbd.o 00:02:59.992 CC lib/ftl/utils/ftl_md.o 00:02:59.992 CC lib/ftl/utils/ftl_bitmap.o 00:02:59.992 CC lib/nbd/nbd_rpc.o 00:02:59.992 CC lib/ftl/utils/ftl_mempool.o 00:02:59.992 CC lib/ftl/utils/ftl_property.o 00:02:59.992 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:59.992 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:59.992 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:59.992 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:59.992 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:59.992 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:59.992 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:59.992 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:59.992 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:59.992 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:59.992 CC lib/ftl/base/ftl_base_bdev.o 00:02:59.992 CC lib/ftl/base/ftl_base_dev.o 00:02:59.992 CC lib/ftl/ftl_trace.o 00:03:00.251 LIB libspdk_scsi.a 00:03:00.511 LIB libspdk_nbd.a 00:03:00.511 LIB libspdk_ublk.a 00:03:00.770 CC lib/iscsi/conn.o 00:03:00.770 CC lib/iscsi/init_grp.o 00:03:00.770 CC lib/iscsi/param.o 00:03:00.770 CC lib/iscsi/iscsi.o 00:03:00.770 CC lib/iscsi/md5.o 00:03:00.770 CC lib/iscsi/portal_grp.o 00:03:00.770 CC lib/iscsi/tgt_node.o 00:03:00.770 CC lib/iscsi/task.o 00:03:00.770 CC lib/iscsi/iscsi_subsystem.o 00:03:00.770 CC lib/iscsi/iscsi_rpc.o 00:03:00.770 CC lib/vhost/vhost_rpc.o 00:03:00.770 CC lib/vhost/vhost.o 00:03:00.770 CC lib/vhost/vhost_scsi.o 00:03:00.770 CC lib/vhost/vhost_blk.o 00:03:00.770 CC lib/vhost/rte_vhost_user.o 00:03:00.770 LIB libspdk_ftl.a 00:03:01.339 LIB libspdk_nvmf.a 00:03:01.339 LIB libspdk_vhost.a 00:03:01.339 LIB libspdk_iscsi.a 00:03:01.909 CC module/vfu_device/vfu_virtio.o 00:03:01.909 CC module/vfu_device/vfu_virtio_scsi.o 00:03:01.909 CC module/vfu_device/vfu_virtio_blk.o 00:03:01.909 CC module/vfu_device/vfu_virtio_rpc.o 00:03:01.909 CC module/env_dpdk/env_dpdk_rpc.o 00:03:01.909 CC module/blob/bdev/blob_bdev.o 00:03:01.909 CC module/accel/iaa/accel_iaa.o 00:03:01.909 CC module/accel/error/accel_error_rpc.o 00:03:01.909 CC module/accel/iaa/accel_iaa_rpc.o 00:03:01.909 CC module/accel/error/accel_error.o 00:03:01.909 CC module/accel/ioat/accel_ioat.o 00:03:01.909 CC module/accel/ioat/accel_ioat_rpc.o 00:03:02.168 CC module/scheduler/gscheduler/gscheduler.o 00:03:02.168 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:02.168 CC module/sock/posix/posix.o 00:03:02.168 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:02.168 LIB libspdk_env_dpdk_rpc.a 00:03:02.168 CC module/accel/dsa/accel_dsa.o 00:03:02.168 CC module/accel/dsa/accel_dsa_rpc.o 00:03:02.168 LIB libspdk_accel_error.a 00:03:02.168 LIB libspdk_scheduler_dpdk_governor.a 00:03:02.168 LIB libspdk_scheduler_gscheduler.a 00:03:02.169 LIB libspdk_accel_iaa.a 00:03:02.169 LIB libspdk_accel_ioat.a 00:03:02.169 LIB libspdk_scheduler_dynamic.a 00:03:02.169 LIB libspdk_blob_bdev.a 00:03:02.169 LIB libspdk_accel_dsa.a 00:03:02.428 LIB libspdk_vfu_device.a 00:03:02.428 LIB libspdk_sock_posix.a 00:03:02.688 CC 
module/blobfs/bdev/blobfs_bdev.o 00:03:02.688 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:02.688 CC module/bdev/error/vbdev_error.o 00:03:02.688 CC module/bdev/error/vbdev_error_rpc.o 00:03:02.688 CC module/bdev/delay/vbdev_delay.o 00:03:02.688 CC module/bdev/nvme/bdev_nvme.o 00:03:02.688 CC module/bdev/nvme/nvme_rpc.o 00:03:02.688 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:02.688 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:02.688 CC module/bdev/malloc/bdev_malloc.o 00:03:02.688 CC module/bdev/nvme/bdev_mdns_client.o 00:03:02.688 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:02.688 CC module/bdev/nvme/vbdev_opal.o 00:03:02.688 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:02.688 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:02.688 CC module/bdev/null/bdev_null.o 00:03:02.688 CC module/bdev/null/bdev_null_rpc.o 00:03:02.688 CC module/bdev/split/vbdev_split.o 00:03:02.688 CC module/bdev/split/vbdev_split_rpc.o 00:03:02.688 CC module/bdev/aio/bdev_aio_rpc.o 00:03:02.688 CC module/bdev/lvol/vbdev_lvol.o 00:03:02.688 CC module/bdev/aio/bdev_aio.o 00:03:02.688 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:02.688 CC module/bdev/passthru/vbdev_passthru.o 00:03:02.688 CC module/bdev/iscsi/bdev_iscsi.o 00:03:02.688 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:02.688 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:02.688 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:02.688 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:02.688 CC module/bdev/raid/bdev_raid.o 00:03:02.688 CC module/bdev/raid/bdev_raid_rpc.o 00:03:02.688 CC module/bdev/raid/raid0.o 00:03:02.688 CC module/bdev/raid/bdev_raid_sb.o 00:03:02.688 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:02.688 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:02.688 CC module/bdev/gpt/gpt.o 00:03:02.688 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:02.688 CC module/bdev/raid/raid1.o 00:03:02.688 CC module/bdev/raid/concat.o 00:03:02.688 CC module/bdev/gpt/vbdev_gpt.o 00:03:02.688 CC module/bdev/ftl/bdev_ftl.o 00:03:02.688 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:02.688 LIB libspdk_blobfs_bdev.a 00:03:02.688 LIB libspdk_bdev_error.a 00:03:02.947 LIB libspdk_bdev_split.a 00:03:02.947 LIB libspdk_bdev_null.a 00:03:02.947 LIB libspdk_bdev_gpt.a 00:03:02.947 LIB libspdk_bdev_passthru.a 00:03:02.947 LIB libspdk_bdev_ftl.a 00:03:02.947 LIB libspdk_bdev_aio.a 00:03:02.947 LIB libspdk_bdev_zone_block.a 00:03:02.947 LIB libspdk_bdev_iscsi.a 00:03:02.947 LIB libspdk_bdev_malloc.a 00:03:02.947 LIB libspdk_bdev_delay.a 00:03:02.947 LIB libspdk_bdev_lvol.a 00:03:02.947 LIB libspdk_bdev_virtio.a 00:03:03.206 LIB libspdk_bdev_raid.a 00:03:03.773 LIB libspdk_bdev_nvme.a 00:03:04.342 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:04.342 CC module/event/subsystems/iobuf/iobuf.o 00:03:04.342 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:04.342 CC module/event/subsystems/sock/sock.o 00:03:04.342 CC module/event/subsystems/scheduler/scheduler.o 00:03:04.342 CC module/event/subsystems/vmd/vmd.o 00:03:04.342 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:04.342 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:04.601 LIB libspdk_event_iobuf.a 00:03:04.601 LIB libspdk_event_vhost_blk.a 00:03:04.601 LIB libspdk_event_sock.a 00:03:04.601 LIB libspdk_event_vfu_tgt.a 00:03:04.601 LIB libspdk_event_scheduler.a 00:03:04.601 LIB libspdk_event_vmd.a 00:03:04.860 CC module/event/subsystems/accel/accel.o 00:03:04.860 LIB libspdk_event_accel.a 00:03:05.119 CC module/event/subsystems/bdev/bdev.o 00:03:05.377 LIB libspdk_event_bdev.a 00:03:05.636 CC 
module/event/subsystems/nbd/nbd.o 00:03:05.636 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:05.636 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:05.636 CC module/event/subsystems/ublk/ublk.o 00:03:05.636 CC module/event/subsystems/scsi/scsi.o 00:03:05.636 LIB libspdk_event_nbd.a 00:03:05.895 LIB libspdk_event_ublk.a 00:03:05.895 LIB libspdk_event_scsi.a 00:03:05.895 LIB libspdk_event_nvmf.a 00:03:06.155 CC module/event/subsystems/iscsi/iscsi.o 00:03:06.155 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:06.155 LIB libspdk_event_iscsi.a 00:03:06.155 LIB libspdk_event_vhost_scsi.a 00:03:06.414 TEST_HEADER include/spdk/accel.h 00:03:06.414 TEST_HEADER include/spdk/accel_module.h 00:03:06.414 TEST_HEADER include/spdk/assert.h 00:03:06.414 TEST_HEADER include/spdk/barrier.h 00:03:06.414 TEST_HEADER include/spdk/bdev.h 00:03:06.414 TEST_HEADER include/spdk/base64.h 00:03:06.414 TEST_HEADER include/spdk/bdev_zone.h 00:03:06.414 TEST_HEADER include/spdk/bdev_module.h 00:03:06.414 TEST_HEADER include/spdk/bit_array.h 00:03:06.414 TEST_HEADER include/spdk/bit_pool.h 00:03:06.414 TEST_HEADER include/spdk/blob_bdev.h 00:03:06.414 TEST_HEADER include/spdk/blobfs.h 00:03:06.414 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:06.414 TEST_HEADER include/spdk/conf.h 00:03:06.414 TEST_HEADER include/spdk/blob.h 00:03:06.677 TEST_HEADER include/spdk/config.h 00:03:06.677 TEST_HEADER include/spdk/cpuset.h 00:03:06.678 TEST_HEADER include/spdk/crc16.h 00:03:06.678 TEST_HEADER include/spdk/crc32.h 00:03:06.678 TEST_HEADER include/spdk/crc64.h 00:03:06.678 TEST_HEADER include/spdk/dif.h 00:03:06.678 CC test/rpc_client/rpc_client_test.o 00:03:06.678 TEST_HEADER include/spdk/endian.h 00:03:06.678 TEST_HEADER include/spdk/dma.h 00:03:06.678 TEST_HEADER include/spdk/env.h 00:03:06.678 TEST_HEADER include/spdk/env_dpdk.h 00:03:06.678 TEST_HEADER include/spdk/event.h 00:03:06.678 TEST_HEADER include/spdk/fd_group.h 00:03:06.678 TEST_HEADER include/spdk/fd.h 00:03:06.678 TEST_HEADER include/spdk/file.h 00:03:06.678 TEST_HEADER include/spdk/ftl.h 00:03:06.678 TEST_HEADER include/spdk/gpt_spec.h 00:03:06.678 CXX app/trace/trace.o 00:03:06.678 TEST_HEADER include/spdk/histogram_data.h 00:03:06.678 TEST_HEADER include/spdk/hexlify.h 00:03:06.678 TEST_HEADER include/spdk/idxd.h 00:03:06.678 CC app/spdk_nvme_identify/identify.o 00:03:06.678 TEST_HEADER include/spdk/idxd_spec.h 00:03:06.678 CC app/trace_record/trace_record.o 00:03:06.678 TEST_HEADER include/spdk/init.h 00:03:06.678 TEST_HEADER include/spdk/ioat.h 00:03:06.678 TEST_HEADER include/spdk/ioat_spec.h 00:03:06.678 TEST_HEADER include/spdk/json.h 00:03:06.678 TEST_HEADER include/spdk/iscsi_spec.h 00:03:06.678 TEST_HEADER include/spdk/jsonrpc.h 00:03:06.678 TEST_HEADER include/spdk/likely.h 00:03:06.678 TEST_HEADER include/spdk/log.h 00:03:06.678 TEST_HEADER include/spdk/lvol.h 00:03:06.678 TEST_HEADER include/spdk/memory.h 00:03:06.678 TEST_HEADER include/spdk/mmio.h 00:03:06.678 TEST_HEADER include/spdk/notify.h 00:03:06.678 TEST_HEADER include/spdk/nbd.h 00:03:06.678 TEST_HEADER include/spdk/nvme_intel.h 00:03:06.678 TEST_HEADER include/spdk/nvme.h 00:03:06.678 CC app/spdk_lspci/spdk_lspci.o 00:03:06.678 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:06.678 CC app/spdk_nvme_perf/perf.o 00:03:06.678 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:06.678 TEST_HEADER include/spdk/nvme_spec.h 00:03:06.678 TEST_HEADER include/spdk/nvme_zns.h 00:03:06.678 CC app/spdk_nvme_discover/discovery_aer.o 00:03:06.678 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:06.678 
CC app/spdk_top/spdk_top.o 00:03:06.678 TEST_HEADER include/spdk/nvmf.h 00:03:06.678 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:06.678 TEST_HEADER include/spdk/nvmf_spec.h 00:03:06.678 TEST_HEADER include/spdk/nvmf_transport.h 00:03:06.678 TEST_HEADER include/spdk/opal.h 00:03:06.678 TEST_HEADER include/spdk/opal_spec.h 00:03:06.678 TEST_HEADER include/spdk/pci_ids.h 00:03:06.678 TEST_HEADER include/spdk/queue.h 00:03:06.678 TEST_HEADER include/spdk/pipe.h 00:03:06.678 TEST_HEADER include/spdk/reduce.h 00:03:06.678 TEST_HEADER include/spdk/rpc.h 00:03:06.678 TEST_HEADER include/spdk/scheduler.h 00:03:06.678 TEST_HEADER include/spdk/scsi.h 00:03:06.678 TEST_HEADER include/spdk/scsi_spec.h 00:03:06.678 TEST_HEADER include/spdk/sock.h 00:03:06.678 TEST_HEADER include/spdk/stdinc.h 00:03:06.678 TEST_HEADER include/spdk/string.h 00:03:06.678 TEST_HEADER include/spdk/thread.h 00:03:06.678 TEST_HEADER include/spdk/trace.h 00:03:06.678 TEST_HEADER include/spdk/trace_parser.h 00:03:06.678 TEST_HEADER include/spdk/tree.h 00:03:06.678 CC app/spdk_dd/spdk_dd.o 00:03:06.678 TEST_HEADER include/spdk/ublk.h 00:03:06.678 TEST_HEADER include/spdk/util.h 00:03:06.678 TEST_HEADER include/spdk/uuid.h 00:03:06.678 TEST_HEADER include/spdk/version.h 00:03:06.678 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:06.678 TEST_HEADER include/spdk/vhost.h 00:03:06.678 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:06.678 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:06.678 TEST_HEADER include/spdk/vmd.h 00:03:06.678 TEST_HEADER include/spdk/xor.h 00:03:06.678 TEST_HEADER include/spdk/zipf.h 00:03:06.678 CXX test/cpp_headers/accel.o 00:03:06.678 CXX test/cpp_headers/accel_module.o 00:03:06.678 CXX test/cpp_headers/assert.o 00:03:06.678 CXX test/cpp_headers/barrier.o 00:03:06.678 CC app/vhost/vhost.o 00:03:06.678 CXX test/cpp_headers/base64.o 00:03:06.678 CXX test/cpp_headers/bdev.o 00:03:06.678 CXX test/cpp_headers/bdev_module.o 00:03:06.678 CXX test/cpp_headers/bdev_zone.o 00:03:06.678 CXX test/cpp_headers/bit_array.o 00:03:06.678 CXX test/cpp_headers/bit_pool.o 00:03:06.678 CXX test/cpp_headers/blob_bdev.o 00:03:06.678 CXX test/cpp_headers/blobfs_bdev.o 00:03:06.678 CXX test/cpp_headers/blobfs.o 00:03:06.678 CXX test/cpp_headers/blob.o 00:03:06.678 CXX test/cpp_headers/conf.o 00:03:06.678 CXX test/cpp_headers/cpuset.o 00:03:06.678 CXX test/cpp_headers/config.o 00:03:06.678 CXX test/cpp_headers/crc16.o 00:03:06.678 CXX test/cpp_headers/crc32.o 00:03:06.678 CXX test/cpp_headers/crc64.o 00:03:06.678 CXX test/cpp_headers/dif.o 00:03:06.678 CXX test/cpp_headers/dma.o 00:03:06.678 CXX test/cpp_headers/endian.o 00:03:06.678 CXX test/cpp_headers/env_dpdk.o 00:03:06.678 CXX test/cpp_headers/env.o 00:03:06.678 CC test/env/pci/pci_ut.o 00:03:06.678 CXX test/cpp_headers/event.o 00:03:06.678 CXX test/cpp_headers/fd_group.o 00:03:06.678 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:06.678 CXX test/cpp_headers/fd.o 00:03:06.678 CXX test/cpp_headers/file.o 00:03:06.678 CXX test/cpp_headers/ftl.o 00:03:06.678 CXX test/cpp_headers/gpt_spec.o 00:03:06.678 CXX test/cpp_headers/hexlify.o 00:03:06.678 CXX test/cpp_headers/histogram_data.o 00:03:06.678 CXX test/cpp_headers/idxd.o 00:03:06.678 CXX test/cpp_headers/idxd_spec.o 00:03:06.678 CC app/nvmf_tgt/nvmf_main.o 00:03:06.678 CC test/env/memory/memory_ut.o 00:03:06.678 CXX test/cpp_headers/init.o 00:03:06.678 CC test/env/vtophys/vtophys.o 00:03:06.678 CC test/app/jsoncat/jsoncat.o 00:03:06.678 CC app/iscsi_tgt/iscsi_tgt.o 00:03:06.678 CC 
test/app/histogram_perf/histogram_perf.o 00:03:06.678 CC test/app/stub/stub.o 00:03:06.678 CC app/spdk_tgt/spdk_tgt.o 00:03:06.678 CC test/event/reactor/reactor.o 00:03:06.678 CC test/thread/poller_perf/poller_perf.o 00:03:06.678 CC test/event/event_perf/event_perf.o 00:03:06.678 CC test/dma/test_dma/test_dma.o 00:03:06.678 CC test/nvme/sgl/sgl.o 00:03:06.678 CC test/bdev/bdevio/bdevio.o 00:03:06.678 CC test/nvme/e2edp/nvme_dp.o 00:03:06.678 CC test/nvme/aer/aer.o 00:03:06.678 CC test/thread/lock/spdk_lock.o 00:03:06.678 CC test/nvme/overhead/overhead.o 00:03:06.678 CC test/event/reactor_perf/reactor_perf.o 00:03:06.678 CC test/nvme/simple_copy/simple_copy.o 00:03:06.678 CC test/nvme/err_injection/err_injection.o 00:03:06.678 CC test/nvme/boot_partition/boot_partition.o 00:03:06.678 CC examples/idxd/perf/perf.o 00:03:06.678 CC examples/ioat/verify/verify.o 00:03:06.678 CC examples/vmd/led/led.o 00:03:06.678 CC test/nvme/reserve/reserve.o 00:03:06.678 CC test/app/bdev_svc/bdev_svc.o 00:03:06.678 CC test/nvme/cuse/cuse.o 00:03:06.678 CC test/nvme/reset/reset.o 00:03:06.678 CC examples/vmd/lsvmd/lsvmd.o 00:03:06.678 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:06.678 CC examples/ioat/perf/perf.o 00:03:06.678 CC test/nvme/compliance/nvme_compliance.o 00:03:06.678 CC test/nvme/startup/startup.o 00:03:06.678 CC test/event/app_repeat/app_repeat.o 00:03:06.678 CC test/nvme/connect_stress/connect_stress.o 00:03:06.678 CC test/nvme/fdp/fdp.o 00:03:06.678 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:06.678 CC examples/nvme/hello_world/hello_world.o 00:03:06.678 CC test/nvme/fused_ordering/fused_ordering.o 00:03:06.678 CC examples/blob/cli/blobcli.o 00:03:06.678 CC examples/accel/perf/accel_perf.o 00:03:06.678 CC examples/nvme/abort/abort.o 00:03:06.678 CC app/fio/nvme/fio_plugin.o 00:03:06.678 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:06.678 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:06.678 CC examples/nvme/arbitration/arbitration.o 00:03:06.678 CC examples/nvme/hotplug/hotplug.o 00:03:06.678 CC examples/blob/hello_world/hello_blob.o 00:03:06.678 CC examples/util/zipf/zipf.o 00:03:06.678 CC examples/sock/hello_world/hello_sock.o 00:03:06.679 CC examples/nvme/reconnect/reconnect.o 00:03:06.679 CC test/accel/dif/dif.o 00:03:06.679 CC test/blobfs/mkfs/mkfs.o 00:03:06.679 CC test/event/scheduler/scheduler.o 00:03:06.679 CC test/env/mem_callbacks/mem_callbacks.o 00:03:06.679 CC examples/nvmf/nvmf/nvmf.o 00:03:06.679 CC examples/bdev/hello_world/hello_bdev.o 00:03:06.679 LINK spdk_lspci 00:03:06.679 CC examples/thread/thread/thread_ex.o 00:03:06.679 LINK rpc_client_test 00:03:06.944 CC app/fio/bdev/fio_plugin.o 00:03:06.944 CC examples/bdev/bdevperf/bdevperf.o 00:03:06.944 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:06.944 CC test/lvol/esnap/esnap.o 00:03:06.944 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:06.944 CXX test/cpp_headers/ioat.o 00:03:06.944 CXX test/cpp_headers/ioat_spec.o 00:03:06.944 CXX test/cpp_headers/iscsi_spec.o 00:03:06.944 LINK spdk_nvme_discover 00:03:06.944 LINK env_dpdk_post_init 00:03:06.944 LINK jsoncat 00:03:06.944 LINK vtophys 00:03:06.944 CXX test/cpp_headers/json.o 00:03:06.944 CXX test/cpp_headers/jsonrpc.o 00:03:06.944 CXX test/cpp_headers/likely.o 00:03:06.944 CXX test/cpp_headers/log.o 00:03:06.944 LINK histogram_perf 00:03:06.944 CXX test/cpp_headers/lvol.o 00:03:06.944 LINK interrupt_tgt 00:03:06.944 CXX test/cpp_headers/memory.o 00:03:06.945 CXX test/cpp_headers/mmio.o 00:03:06.945 CXX test/cpp_headers/nbd.o 00:03:06.945 CXX 
test/cpp_headers/notify.o 00:03:06.945 CXX test/cpp_headers/nvme.o 00:03:06.945 CXX test/cpp_headers/nvme_intel.o 00:03:06.945 LINK spdk_trace_record 00:03:06.945 CXX test/cpp_headers/nvme_ocssd.o 00:03:06.945 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:06.945 CXX test/cpp_headers/nvme_spec.o 00:03:06.945 LINK vhost 00:03:06.945 CXX test/cpp_headers/nvme_zns.o 00:03:06.945 CXX test/cpp_headers/nvmf_cmd.o 00:03:06.945 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:06.945 LINK reactor 00:03:06.945 CXX test/cpp_headers/nvmf.o 00:03:06.945 LINK event_perf 00:03:06.945 CXX test/cpp_headers/nvmf_spec.o 00:03:06.945 CXX test/cpp_headers/nvmf_transport.o 00:03:06.945 CXX test/cpp_headers/opal.o 00:03:06.945 CXX test/cpp_headers/opal_spec.o 00:03:06.945 CXX test/cpp_headers/pci_ids.o 00:03:06.945 LINK stub 00:03:06.945 CXX test/cpp_headers/pipe.o 00:03:06.945 CXX test/cpp_headers/queue.o 00:03:06.945 LINK poller_perf 00:03:06.945 CXX test/cpp_headers/reduce.o 00:03:06.945 CXX test/cpp_headers/rpc.o 00:03:06.945 LINK led 00:03:06.945 CXX test/cpp_headers/scheduler.o 00:03:06.945 LINK reactor_perf 00:03:06.945 LINK lsvmd 00:03:06.945 CXX test/cpp_headers/scsi.o 00:03:06.945 LINK nvmf_tgt 00:03:06.945 LINK app_repeat 00:03:06.945 LINK zipf 00:03:06.945 CXX test/cpp_headers/scsi_spec.o 00:03:06.945 LINK err_injection 00:03:06.945 LINK iscsi_tgt 00:03:06.945 LINK boot_partition 00:03:06.945 LINK connect_stress 00:03:06.945 LINK doorbell_aers 00:03:06.945 LINK bdev_svc 00:03:06.945 LINK startup 00:03:06.945 LINK spdk_tgt 00:03:06.945 LINK pmr_persistence 00:03:06.945 CXX test/cpp_headers/sock.o 00:03:06.945 LINK cmb_copy 00:03:06.945 LINK verify 00:03:06.945 LINK fused_ordering 00:03:06.945 LINK reserve 00:03:06.945 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:06.945 LINK simple_copy 00:03:06.945 LINK mkfs 00:03:06.945 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:06.945 struct spdk_nvme_fdp_ruhs ruhs; 00:03:06.945 ^ 00:03:06.945 LINK ioat_perf 00:03:06.945 LINK hello_world 00:03:07.205 LINK hotplug 00:03:07.205 LINK hello_blob 00:03:07.205 LINK hello_sock 00:03:07.205 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:07.205 LINK aer 00:03:07.205 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:07.205 LINK reset 00:03:07.205 LINK scheduler 00:03:07.205 LINK fdp 00:03:07.205 LINK sgl 00:03:07.205 CXX test/cpp_headers/stdinc.o 00:03:07.205 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:07.205 LINK nvme_dp 00:03:07.205 LINK overhead 00:03:07.205 CXX test/cpp_headers/string.o 00:03:07.205 CXX test/cpp_headers/thread.o 00:03:07.205 LINK spdk_trace 00:03:07.205 CXX test/cpp_headers/trace.o 00:03:07.205 LINK hello_bdev 00:03:07.205 CXX test/cpp_headers/trace_parser.o 00:03:07.205 CXX test/cpp_headers/tree.o 00:03:07.205 CXX test/cpp_headers/ublk.o 00:03:07.205 LINK thread 00:03:07.205 CXX test/cpp_headers/util.o 00:03:07.205 CXX test/cpp_headers/uuid.o 00:03:07.205 CXX test/cpp_headers/version.o 00:03:07.205 CXX test/cpp_headers/vfio_user_pci.o 00:03:07.205 CXX test/cpp_headers/vfio_user_spec.o 00:03:07.205 LINK nvmf 00:03:07.205 CXX test/cpp_headers/vhost.o 00:03:07.205 CXX test/cpp_headers/vmd.o 00:03:07.205 CXX test/cpp_headers/xor.o 00:03:07.205 CXX test/cpp_headers/zipf.o 00:03:07.205 LINK idxd_perf 00:03:07.205 LINK test_dma 00:03:07.205 LINK bdevio 00:03:07.205 LINK reconnect 00:03:07.205 LINK arbitration 00:03:07.205 LINK spdk_dd 00:03:07.205 
LINK dif 00:03:07.205 LINK abort 00:03:07.205 LINK nvme_compliance 00:03:07.464 LINK pci_ut 00:03:07.464 LINK nvme_manage 00:03:07.464 LINK blobcli 00:03:07.464 LINK nvme_fuzz 00:03:07.464 LINK accel_perf 00:03:07.464 LINK mem_callbacks 00:03:07.464 LINK spdk_bdev 00:03:07.464 1 warning generated. 00:03:07.464 LINK llvm_vfio_fuzz 00:03:07.723 LINK spdk_nvme 00:03:07.723 LINK vhost_fuzz 00:03:07.723 LINK memory_ut 00:03:07.723 LINK spdk_nvme_identify 00:03:07.723 LINK bdevperf 00:03:07.723 LINK spdk_top 00:03:07.723 LINK spdk_nvme_perf 00:03:07.981 LINK cuse 00:03:07.981 LINK llvm_nvme_fuzz 00:03:08.240 LINK spdk_lock 00:03:08.498 LINK iscsi_fuzz 00:03:11.034 LINK esnap 00:03:11.034 00:03:11.034 real 0m23.853s 00:03:11.034 user 4m36.654s 00:03:11.034 sys 1m56.704s 00:03:11.035 06:45:40 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:11.035 06:45:40 -- common/autotest_common.sh@10 -- $ set +x 00:03:11.035 ************************************ 00:03:11.035 END TEST make 00:03:11.035 ************************************ 00:03:11.035 06:45:40 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:11.035 06:45:40 -- nvmf/common.sh@7 -- # uname -s 00:03:11.035 06:45:40 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:11.035 06:45:40 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:11.035 06:45:40 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:11.035 06:45:40 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:11.035 06:45:40 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:11.035 06:45:40 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:11.035 06:45:40 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:11.035 06:45:40 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:11.035 06:45:40 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:11.035 06:45:40 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:11.035 06:45:40 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:11.035 06:45:40 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:11.035 06:45:40 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:11.035 06:45:40 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:11.035 06:45:40 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:11.035 06:45:40 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:11.035 06:45:40 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:11.035 06:45:40 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:11.035 06:45:40 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:11.035 06:45:40 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:11.035 06:45:40 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:11.035 06:45:40 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:11.035 06:45:40 -- paths/export.sh@5 -- # export PATH 00:03:11.035 06:45:40 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:11.035 06:45:40 -- nvmf/common.sh@46 -- # : 0 00:03:11.035 06:45:40 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:11.035 06:45:40 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:11.035 06:45:40 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:11.035 06:45:40 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:11.035 06:45:40 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:11.035 06:45:40 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:11.035 06:45:40 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:11.035 06:45:40 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:11.035 06:45:40 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:11.035 06:45:40 -- spdk/autotest.sh@32 -- # uname -s 00:03:11.035 06:45:40 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:11.035 06:45:40 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:11.035 06:45:40 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:11.035 06:45:40 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:11.035 06:45:40 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:11.035 06:45:40 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:11.035 06:45:40 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:11.035 06:45:40 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:11.035 06:45:40 -- spdk/autotest.sh@48 -- # udevadm_pid=2549262 00:03:11.035 06:45:40 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:11.035 06:45:40 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:11.035 06:45:40 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:11.035 06:45:40 -- spdk/autotest.sh@54 -- # echo 2549264 00:03:11.035 06:45:40 -- spdk/autotest.sh@56 -- # echo 2549265 00:03:11.035 06:45:40 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:11.035 06:45:40 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:11.035 06:45:40 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:11.035 06:45:40 -- spdk/autotest.sh@60 -- # echo 2549266 00:03:11.035 06:45:40 -- spdk/autotest.sh@62 -- # echo 2549267 00:03:11.035 06:45:40 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:11.035 06:45:40 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:11.035 06:45:40 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:11.035 06:45:40 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:11.035 06:45:40 -- common/autotest_common.sh@10 -- # set +x 00:03:11.035 06:45:40 -- spdk/autotest.sh@70 -- # create_test_list 00:03:11.035 06:45:40 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:11.035 06:45:40 -- common/autotest_common.sh@10 -- # set +x 00:03:11.376 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:11.376 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:11.376 06:45:40 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:11.376 06:45:40 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:11.376 06:45:40 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:11.376 06:45:40 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:11.376 06:45:40 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:11.376 06:45:40 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:11.376 06:45:40 -- common/autotest_common.sh@1440 -- # uname 00:03:11.376 06:45:40 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:11.376 06:45:40 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:11.376 06:45:40 -- common/autotest_common.sh@1460 -- # uname 00:03:11.376 06:45:41 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:11.376 06:45:41 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:11.376 06:45:41 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:11.376 06:45:41 -- spdk/autotest.sh@83 -- # hash lcov 00:03:11.376 06:45:41 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:11.376 06:45:41 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:11.376 06:45:41 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:11.376 06:45:41 -- common/autotest_common.sh@10 -- # set +x 00:03:11.376 06:45:41 -- spdk/autotest.sh@102 -- # rm -f 00:03:11.376 06:45:41 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:14.675 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:14.675 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:14.675 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:14.675 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:14.675 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:14.675 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:14.675 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:14.675 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:14.934 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:15.193 06:45:44 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:15.193 06:45:44 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:15.193 06:45:44 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:15.193 06:45:44 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:15.193 06:45:44 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:15.193 06:45:44 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:15.193 06:45:44 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:15.193 06:45:44 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:15.193 06:45:44 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:15.193 06:45:44 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:15.193 06:45:44 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:15.193 06:45:44 -- spdk/autotest.sh@121 -- # grep -v p 00:03:15.193 06:45:44 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:15.193 06:45:44 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:15.193 06:45:44 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:15.193 06:45:44 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:15.193 06:45:44 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:15.193 No valid GPT data, bailing 00:03:15.193 06:45:44 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:15.193 06:45:44 -- scripts/common.sh@393 -- # pt= 00:03:15.193 06:45:44 -- scripts/common.sh@394 -- # return 1 00:03:15.193 06:45:44 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:15.193 1+0 records in 00:03:15.193 1+0 records out 00:03:15.193 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00229478 s, 457 MB/s 00:03:15.193 06:45:44 -- spdk/autotest.sh@129 -- # sync 00:03:15.193 06:45:44 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:15.193 06:45:44 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:15.193 06:45:44 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:23.316 06:45:51 -- spdk/autotest.sh@135 -- # uname -s 00:03:23.316 06:45:51 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:23.316 06:45:51 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:23.316 06:45:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:23.316 06:45:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:23.316 06:45:51 -- common/autotest_common.sh@10 -- # set +x 00:03:23.316 ************************************ 00:03:23.316 START TEST setup.sh 00:03:23.316 ************************************ 00:03:23.316 06:45:51 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:23.316 * Looking for test storage... 00:03:23.316 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:23.316 06:45:51 -- setup/test-setup.sh@10 -- # uname -s 00:03:23.316 06:45:51 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:23.316 06:45:51 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:23.316 06:45:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:23.316 06:45:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:23.316 06:45:51 -- common/autotest_common.sh@10 -- # set +x 00:03:23.316 ************************************ 00:03:23.316 START TEST acl 00:03:23.316 ************************************ 00:03:23.316 06:45:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:23.316 * Looking for test storage... 00:03:23.316 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:23.316 06:45:52 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:23.316 06:45:52 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:23.316 06:45:52 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:23.316 06:45:52 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:23.316 06:45:52 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:23.316 06:45:52 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:23.316 06:45:52 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:23.316 06:45:52 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:23.316 06:45:52 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:23.316 06:45:52 -- setup/acl.sh@12 -- # devs=() 00:03:23.316 06:45:52 -- setup/acl.sh@12 -- # declare -a devs 00:03:23.316 06:45:52 -- setup/acl.sh@13 -- # drivers=() 00:03:23.316 06:45:52 -- setup/acl.sh@13 -- # declare -A drivers 00:03:23.316 06:45:52 -- setup/acl.sh@51 -- # setup reset 00:03:23.316 06:45:52 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:23.316 06:45:52 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.842 06:45:55 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:25.842 06:45:55 -- setup/acl.sh@16 -- # local dev driver 00:03:25.842 06:45:55 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:25.842 06:45:55 -- setup/acl.sh@15 -- # setup output status 00:03:25.842 06:45:55 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:25.842 06:45:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:29.128 Hugepages 00:03:29.128 node hugesize free / total 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 00:03:29.128 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:58 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:58 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:58 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:59 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:59 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:59 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.128 06:45:59 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.128 06:45:59 -- setup/acl.sh@20 -- # continue 00:03:29.128 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.387 06:45:59 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:29.387 06:45:59 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.387 06:45:59 -- setup/acl.sh@20 -- # continue 00:03:29.387 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.387 06:45:59 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:29.387 06:45:59 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:29.387 06:45:59 -- setup/acl.sh@20 -- # continue 00:03:29.387 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.387 06:45:59 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:29.387 06:45:59 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:29.387 06:45:59 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:29.387 06:45:59 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:29.387 06:45:59 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:29.387 06:45:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:29.387 06:45:59 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:29.387 06:45:59 -- setup/acl.sh@54 -- # run_test denied denied 00:03:29.387 06:45:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:29.387 06:45:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:29.387 06:45:59 -- common/autotest_common.sh@10 -- # set +x 00:03:29.387 ************************************ 00:03:29.387 START TEST denied 00:03:29.387 ************************************ 00:03:29.387 06:45:59 -- common/autotest_common.sh@1104 -- # denied 00:03:29.387 06:45:59 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:29.387 06:45:59 -- setup/acl.sh@38 -- # setup output config 00:03:29.387 06:45:59 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:29.387 06:45:59 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:29.387 06:45:59 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:33.576 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:33.576 06:46:02 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:33.576 06:46:02 -- setup/acl.sh@28 -- # local dev driver 00:03:33.576 06:46:02 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:33.576 06:46:02 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:33.576 06:46:02 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:33.576 06:46:02 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:33.576 06:46:02 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:33.576 06:46:02 -- setup/acl.sh@41 -- # setup reset 00:03:33.576 06:46:02 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:33.576 06:46:02 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:37.768 00:03:37.768 real 0m8.478s 00:03:37.768 user 0m2.791s 00:03:37.768 sys 0m5.056s 00:03:37.768 06:46:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:37.768 06:46:07 -- 
common/autotest_common.sh@10 -- # set +x 00:03:37.768 ************************************ 00:03:37.768 END TEST denied 00:03:37.768 ************************************ 00:03:38.026 06:46:07 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:38.026 06:46:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:38.026 06:46:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:38.026 06:46:07 -- common/autotest_common.sh@10 -- # set +x 00:03:38.026 ************************************ 00:03:38.026 START TEST allowed 00:03:38.026 ************************************ 00:03:38.026 06:46:07 -- common/autotest_common.sh@1104 -- # allowed 00:03:38.026 06:46:07 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:38.026 06:46:07 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:38.026 06:46:07 -- setup/acl.sh@45 -- # setup output config 00:03:38.026 06:46:07 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:38.026 06:46:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:43.302 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:43.302 06:46:12 -- setup/acl.sh@47 -- # verify 00:03:43.302 06:46:12 -- setup/acl.sh@28 -- # local dev driver 00:03:43.302 06:46:12 -- setup/acl.sh@48 -- # setup reset 00:03:43.302 06:46:12 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:43.302 06:46:12 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:46.593 00:03:46.593 real 0m8.549s 00:03:46.593 user 0m2.289s 00:03:46.593 sys 0m4.744s 00:03:46.593 06:46:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.593 06:46:16 -- common/autotest_common.sh@10 -- # set +x 00:03:46.593 ************************************ 00:03:46.593 END TEST allowed 00:03:46.593 ************************************ 00:03:46.593 00:03:46.593 real 0m24.325s 00:03:46.593 user 0m7.657s 00:03:46.593 sys 0m14.739s 00:03:46.593 06:46:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.593 06:46:16 -- common/autotest_common.sh@10 -- # set +x 00:03:46.593 ************************************ 00:03:46.593 END TEST acl 00:03:46.593 ************************************ 00:03:46.593 06:46:16 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:46.593 06:46:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:46.593 06:46:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:46.593 06:46:16 -- common/autotest_common.sh@10 -- # set +x 00:03:46.593 ************************************ 00:03:46.593 START TEST hugepages 00:03:46.593 ************************************ 00:03:46.594 06:46:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:46.594 * Looking for test storage... 
00:03:46.594 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:46.594 06:46:16 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:46.594 06:46:16 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:46.594 06:46:16 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:46.594 06:46:16 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:46.594 06:46:16 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:46.594 06:46:16 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:46.594 06:46:16 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:46.594 06:46:16 -- setup/common.sh@18 -- # local node= 00:03:46.594 06:46:16 -- setup/common.sh@19 -- # local var val 00:03:46.594 06:46:16 -- setup/common.sh@20 -- # local mem_f mem 00:03:46.594 06:46:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.594 06:46:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.594 06:46:16 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.594 06:46:16 -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.594 06:46:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.594 06:46:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.594 06:46:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.594 06:46:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 40348952 kB' 'MemAvailable: 42087820 kB' 'Buffers: 4476 kB' 'Cached: 11419704 kB' 'SwapCached: 8 kB' 'Active: 10492132 kB' 'Inactive: 1585260 kB' 'Active(anon): 9964436 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 656548 kB' 'Mapped: 219112 kB' 'Shmem: 9322248 kB' 'KReclaimable: 267716 kB' 'Slab: 846400 kB' 'SReclaimable: 267716 kB' 'SUnreclaim: 578684 kB' 'KernelStack: 21984 kB' 'PageTables: 9008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439056 kB' 'Committed_AS: 11298496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215828 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:03:46.594
[xtrace condensed: setup/common.sh@32 compared each /proc/meminfo key above against Hugepagesize, logging '[[ <key> == \H\u\g\e\p\a\g\e\s\i\z\e ]]' and '# continue' for every non-matching key from MemTotal through DirectMap1G; the repetitive scan is omitted here]
00:03:46.595 06:46:16 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:46.595 06:46:16 -- setup/common.sh@33 -- # echo 2048 00:03:46.595 06:46:16 -- setup/common.sh@33 -- # return 0 00:03:46.595 06:46:16 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:46.595 06:46:16 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:46.595 06:46:16 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:46.595 06:46:16 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:46.595 06:46:16 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:46.595 06:46:16 --
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:46.595 06:46:16 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:46.595 06:46:16 -- setup/hugepages.sh@207 -- # get_nodes 00:03:46.595 06:46:16 -- setup/hugepages.sh@27 -- # local node 00:03:46.595 06:46:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:46.595 06:46:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:46.595 06:46:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:46.595 06:46:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:46.595 06:46:16 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:46.595 06:46:16 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:46.595 06:46:16 -- setup/hugepages.sh@208 -- # clear_hp 00:03:46.595 06:46:16 -- setup/hugepages.sh@37 -- # local node hp 00:03:46.595 06:46:16 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:46.595 06:46:16 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:46.595 06:46:16 -- setup/hugepages.sh@41 -- # echo 0 00:03:46.595 06:46:16 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:46.595 06:46:16 -- setup/hugepages.sh@41 -- # echo 0 00:03:46.595 06:46:16 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:46.595 06:46:16 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:46.595 06:46:16 -- setup/hugepages.sh@41 -- # echo 0 00:03:46.595 06:46:16 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:46.595 06:46:16 -- setup/hugepages.sh@41 -- # echo 0 00:03:46.595 06:46:16 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:46.595 06:46:16 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:46.595 06:46:16 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:46.595 06:46:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:46.595 06:46:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:46.595 06:46:16 -- common/autotest_common.sh@10 -- # set +x 00:03:46.855 ************************************ 00:03:46.855 START TEST default_setup 00:03:46.855 ************************************ 00:03:46.855 06:46:16 -- common/autotest_common.sh@1104 -- # default_setup 00:03:46.855 06:46:16 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:46.855 06:46:16 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:46.855 06:46:16 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:46.855 06:46:16 -- setup/hugepages.sh@51 -- # shift 00:03:46.855 06:46:16 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:46.855 06:46:16 -- setup/hugepages.sh@52 -- # local node_ids 00:03:46.855 06:46:16 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:46.855 06:46:16 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:46.855 06:46:16 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:46.855 06:46:16 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:46.855 06:46:16 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:46.855 06:46:16 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:46.855 06:46:16 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:46.855 06:46:16 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:46.855 06:46:16 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:46.855 06:46:16 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
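Every get_meminfo expansion in this trace follows the same shape: default mem_f to /proc/meminfo, switch to the per-node file when a node is given, mapfile the contents, strip the 'Node <n> ' prefix that per-node files carry, then read key/value pairs with IFS=': ' until the requested key matches and echo the value (2048 above, since 'Hugepagesize: 2048 kB' splits into var=Hugepagesize, val=2048). A simplified sketch of that helper, reconstructed from the xtrace; the authoritative version lives in spdk/test/setup/common.sh, so treat details here as approximate:

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern used below
# get_meminfo <key> [node] - reconstructed from the setup/common.sh xtrace.
get_meminfo() {
    local get=$1 node=$2
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # With a node argument, prefer that NUMA node's meminfo file.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node meminfo prefixes every line with "Node <n> "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val" && return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo Hugepagesize   # -> 2048 on this machine (value in kB)

The same numbers also explain nr_hugepages=1024 in the get_test_nr_hugepages trace above: the test asked for size=2097152 (kB), and 2097152 / 2048 = 1024 default-sized pages, assigned to node 0.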
00:03:46.855 06:46:16 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:46.855 06:46:16 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:46.855 06:46:16 -- setup/hugepages.sh@73 -- # return 0 00:03:46.855 06:46:16 -- setup/hugepages.sh@137 -- # setup output 00:03:46.855 06:46:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.855 06:46:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:50.148 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:50.148 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:50.407 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:50.407 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:50.407 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:51.787 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:51.787 06:46:21 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:51.787 06:46:21 -- setup/hugepages.sh@89 -- # local node 00:03:51.787 06:46:21 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:51.787 06:46:21 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:51.787 06:46:21 -- setup/hugepages.sh@92 -- # local surp 00:03:51.787 06:46:21 -- setup/hugepages.sh@93 -- # local resv 00:03:51.787 06:46:21 -- setup/hugepages.sh@94 -- # local anon 00:03:51.787 06:46:21 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:51.787 06:46:21 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:51.787 06:46:21 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:51.787 06:46:21 -- setup/common.sh@18 -- # local node= 00:03:51.787 06:46:21 -- setup/common.sh@19 -- # local var val 00:03:51.787 06:46:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:51.787 06:46:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.787 06:46:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:51.787 06:46:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:51.787 06:46:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.787 06:46:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.787 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.787 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.787 06:46:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42586080 kB' 'MemAvailable: 44324932 kB' 'Buffers: 4476 kB' 'Cached: 11419828 kB' 'SwapCached: 8 kB' 'Active: 10500716 kB' 'Inactive: 1585260 kB' 'Active(anon): 9973020 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 665132 kB' 'Mapped: 218808 kB' 'Shmem: 9322372 kB' 'KReclaimable: 267684 kB' 'Slab: 845032 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577348 kB' 'KernelStack: 
21984 kB' 'PageTables: 8976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11306592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215892 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:03:51.787
[xtrace condensed: setup/common.sh@32 compared each key above against AnonHugePages, logging '[[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]' and '# continue' for every non-matching key; the repetitive scan is omitted here]
00:03:51.788 06:46:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.788 06:46:21 -- setup/common.sh@33 -- # echo 0 00:03:51.788 06:46:21 -- setup/common.sh@33 -- # return 0 00:03:51.788 06:46:21 -- setup/hugepages.sh@97 -- # anon=0
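Before counting hugepages, verify_nr_hugepages rules out transparent-hugepage noise: the '[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]' test above matches the contents of the transparent_hugepage 'enabled' control file (the bracketed word is the active mode), and only because THP is not set to never does it sample AnonHugePages, which is 0 kB here, hence anon=0. A hedged sketch of that probe; the sysfs path is the standard kernel location, inferred rather than shown in this log, and the awk line stands in for the script's own get_meminfo helper:

#!/usr/bin/env bash
# Sketch of the anon-hugepages probe traced at setup/hugepages.sh@96-97.
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
# The file reads e.g. "always [madvise] never"; brackets mark the active mode.
if [[ $thp != *"[never]"* ]]; then
    # kB of THP-backed anonymous memory (the real script uses get_meminfo).
    anon=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
else
    anon=0
fi
echo "anon=$anon"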
kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- 
setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.051 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.051 06:46:21 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 
00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.052 06:46:21 -- setup/common.sh@33 -- # echo 0 00:03:52.052 06:46:21 -- setup/common.sh@33 -- # return 0 00:03:52.052 06:46:21 -- setup/hugepages.sh@99 -- # surp=0 00:03:52.052 06:46:21 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:52.052 06:46:21 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:52.052 06:46:21 -- setup/common.sh@18 -- # local node= 00:03:52.052 06:46:21 -- setup/common.sh@19 -- # local var val 00:03:52.052 06:46:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.052 06:46:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.052 06:46:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.052 06:46:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.052 06:46:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.052 06:46:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.052 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.052 06:46:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42584368 kB' 'MemAvailable: 44323220 kB' 'Buffers: 4476 kB' 'Cached: 11419844 kB' 'SwapCached: 8 kB' 'Active: 10501344 kB' 'Inactive: 1585260 kB' 'Active(anon): 9973648 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 665928 kB' 'Mapped: 218756 kB' 'Shmem: 9322388 kB' 'KReclaimable: 267684 kB' 'Slab: 845012 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577328 kB' 'KernelStack: 22208 kB' 'PageTables: 9220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11305228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216020 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.052 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.053 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.053 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.053 06:46:21 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.053 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.053 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.053 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.053 06:46:21 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.053 06:46:21 -- setup/common.sh@32 -- # continue 00:03:52.053 06:46:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.053 06:46:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.053 06:46:21 -- setup/common.sh@32 -- 
# [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
[... xtrace condensed: the IFS=': ' read -r var val _ / compare / continue cycle repeats for every /proc/meminfo key from Buffers through HugePages_Free, each failing the [[ $var == HugePages_Rsvd ]] test ...]
00:03:52.054 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:52.054 06:46:21 -- setup/common.sh@33 -- # echo 0
00:03:52.054 06:46:21 -- setup/common.sh@33 -- # return 0
00:03:52.054 06:46:21 -- setup/hugepages.sh@100 -- # resv=0
00:03:52.054 06:46:21 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:52.054 nr_hugepages=1024
00:03:52.054 06:46:21 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:52.054 resv_hugepages=0
00:03:52.054 06:46:21 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:52.054 surplus_hugepages=0
00:03:52.054 06:46:21 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:52.054 anon_hugepages=0
00:03:52.054 06:46:21 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:52.054 06:46:21 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
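The scan above is setup/common.sh's get_meminfo walking a meminfo stream key by key until the requested field matches. A minimal runnable sketch of that parser, reconstructed from the xtrace alone (the locals get/node/mem_f/mem and the read/strip steps are visible in the trace; the exact control flow around the node check is an assumption):

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below
    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f=/proc/meminfo mem
        # Use the per-node view when a node id was given and sysfs exposes it.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        # Scan "Key:   value [kB]" lines until the requested key matches.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
    get_meminfo HugePages_Rsvd      # prints 0 on this box
    get_meminfo HugePages_Surp 0    # per-node variant, as used below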
00:03:52.054 06:46:21 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:52.054 06:46:21 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:52.054 06:46:21 -- setup/common.sh@18 -- # local node=
00:03:52.054 06:46:21 -- setup/common.sh@19 -- # local var val
00:03:52.054 06:46:21 -- setup/common.sh@20 -- # local mem_f mem
00:03:52.054 06:46:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.054 06:46:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.054 06:46:21 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.054 06:46:21 -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.054 06:46:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.054 06:46:21 -- setup/common.sh@31 -- # IFS=': '
00:03:52.054 06:46:21 -- setup/common.sh@31 -- # read -r var val _
00:03:52.054 06:46:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42583884 kB' 'MemAvailable: 44322736 kB' 'Buffers: 4476 kB' 'Cached: 11419856 kB' 'SwapCached: 8 kB' 'Active: 10500980 kB' 'Inactive: 1585260 kB' 'Active(anon): 9973284 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 665336 kB' 'Mapped: 218756 kB' 'Shmem: 9322400 kB' 'KReclaimable: 267684 kB' 'Slab: 844980 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577296 kB' 'KernelStack: 22256 kB' 'PageTables: 9264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11306636 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216036 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[... xtrace condensed: the read/compare/continue cycle repeats for every key from MemTotal through Unaccepted, each failing the [[ $var == HugePages_Total ]] test ...]
00:03:52.055 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:52.055 06:46:21 -- setup/common.sh@33 -- # echo 1024
00:03:52.055 06:46:21 -- setup/common.sh@33 -- # return 0
00:03:52.055 06:46:21 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:52.055 06:46:21 -- setup/hugepages.sh@112 -- # get_nodes
00:03:52.055 06:46:21 -- setup/hugepages.sh@27 -- # local node
00:03:52.055 06:46:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:52.055 06:46:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:52.055 06:46:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:52.055 06:46:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:52.055 06:46:21 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:52.055 06:46:21 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:52.055 06:46:21 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:52.055 06:46:21 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
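get_nodes above discovers the NUMA topology by globbing /sys/devices/system/node. A standalone sketch of that enumeration, assuming the standard "Node N Key: value" layout of the per-node meminfo files (the trace only shows the resulting assignments nodes_sys[0]=1024 and nodes_sys[1]=0; the awk extraction is an assumed equivalent):

    #!/usr/bin/env bash
    shopt -s extglob nullglob
    declare -a nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # node0, node1, ... -> numeric index after the "node" prefix
        nodes_sys[${node##*node}]=$(awk '/HugePages_Total/ {print $4}' "$node/meminfo")
    done
    echo "${#nodes_sys[@]} NUMA nodes, hugepages per node: ${nodes_sys[*]}"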
00:03:52.055 06:46:21 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:52.055 06:46:21 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:52.055 06:46:21 -- setup/common.sh@18 -- # local node=0
00:03:52.055 06:46:21 -- setup/common.sh@19 -- # local var val
00:03:52.055 06:46:21 -- setup/common.sh@20 -- # local mem_f mem
00:03:52.055 06:46:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.055 06:46:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:52.055 06:46:21 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:52.055 06:46:21 -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.055 06:46:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.055 06:46:21 -- setup/common.sh@31 -- # IFS=': '
00:03:52.055 06:46:21 -- setup/common.sh@31 -- # read -r var val _
00:03:52.055 06:46:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25190396 kB' 'MemUsed: 7401688 kB' 'SwapCached: 8 kB' 'Active: 3919068 kB' 'Inactive: 187304 kB' 'Active(anon): 3621900 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956088 kB' 'Mapped: 167608 kB' 'AnonPages: 154080 kB' 'Shmem: 3471616 kB' 'KernelStack: 12088 kB' 'PageTables: 4728 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 390620 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 266140 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... xtrace condensed: the read/compare/continue cycle repeats for every node0 key from MemTotal through HugePages_Free, each failing the [[ $var == HugePages_Surp ]] test ...]
00:03:52.056 06:46:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:52.056 06:46:21 -- setup/common.sh@33 -- # echo 0
00:03:52.056 06:46:21 -- setup/common.sh@33 -- # return 0
00:03:52.056 06:46:21 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:52.057 06:46:21 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:52.057 06:46:21 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:52.057 06:46:21 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:52.057 06:46:21 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:52.057 node0=1024 expecting 1024
00:03:52.057 06:46:21 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:52.057 real 0m5.306s
00:03:52.057 user 0m1.446s
00:03:52.057 sys 0m2.466s
00:03:52.057 06:46:21 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:52.057 06:46:21 -- common/autotest_common.sh@10 -- # set +x
00:03:52.057 ************************************
00:03:52.057 END TEST default_setup
00:03:52.057 ************************************
00:03:52.057 06:46:21 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:52.057 06:46:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:52.057 06:46:21 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:52.057 06:46:21 -- common/autotest_common.sh@10 -- # set +x
00:03:52.057 ************************************
00:03:52.057 START TEST per_node_1G_alloc
00:03:52.057 ************************************
00:03:52.057 06:46:21 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:03:52.057 06:46:21 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:52.057 06:46:21 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:52.057 06:46:21 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:52.057 06:46:21 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:52.057 06:46:21 -- setup/hugepages.sh@51 -- # shift
00:03:52.057 06:46:21 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:52.057 06:46:21 -- setup/hugepages.sh@52 -- # local node_ids
00:03:52.057 06:46:21 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:52.057 06:46:21 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:52.057 06:46:21 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:52.057 06:46:21 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:52.057 06:46:21 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:52.057 06:46:21 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:52.057 06:46:21 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:52.057 06:46:21 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:52.057 06:46:21 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:52.057 06:46:21 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:52.057 06:46:21 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:52.057 06:46:21 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:52.057 06:46:21 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:52.057 06:46:21 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:52.057 06:46:21 -- setup/hugepages.sh@73 -- # return 0
00:03:52.057 06:46:21 -- setup/hugepages.sh@146 -- # NRHUGE=512
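The get_test_nr_hugepages trace above reduces to simple arithmetic: the requested size of 1048576 kB divided by the 2048 kB Hugepagesize reported in the meminfo dumps gives 512 pages, assigned to each of the two requested nodes, i.e. 1024 pages system-wide:

    # Values taken from the trace; default_hugepages is assumed to hold
    # the Hugepagesize from /proc/meminfo, in kB.
    size=1048576            # 1 GiB expressed in kB
    default_hugepages=2048  # 2 MiB hugepages
    user_nodes=(0 1)
    nr_hugepages=$(( size / default_hugepages ))           # 512 per node
    echo "total: $(( nr_hugepages * ${#user_nodes[@]} ))"  # 1024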
00:03:52.057 06:46:21 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:52.057 06:46:21 -- setup/hugepages.sh@146 -- # setup output
00:03:52.057 06:46:21 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:52.057 06:46:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:55.348 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:55.348 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:55.610 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:55.611 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:55.611 06:46:25 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
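setup.sh runs here with NRHUGE=512 and HUGENODE=0,1 and finds the devices already bound to vfio-pci. One plausible shape of the per-node hugepage reservation it performs, using the standard kernel sysfs knob (a sketch of the interface, not necessarily the exact setup.sh code; requires root):

    NRHUGE=512
    IFS=',' read -ra nodes <<< "0,1"
    for n in "${nodes[@]}"; do
        # Per-node reservation knob for 2 MiB pages on this machine.
        echo "$NRHUGE" > "/sys/devices/system/node/node$n/hugepages/hugepages-2048kB/nr_hugepages"
    done
    grep HugePages_Total /proc/meminfo   # should now report 1024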
00:03:55.611 06:46:25 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:55.611 06:46:25 -- setup/hugepages.sh@89 -- # local node
00:03:55.611 06:46:25 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:55.611 06:46:25 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:55.611 06:46:25 -- setup/hugepages.sh@92 -- # local surp
00:03:55.611 06:46:25 -- setup/hugepages.sh@93 -- # local resv
00:03:55.611 06:46:25 -- setup/hugepages.sh@94 -- # local anon
00:03:55.611 06:46:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:55.611 06:46:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:55.611 06:46:25 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:55.611 06:46:25 -- setup/common.sh@18 -- # local node=
00:03:55.611 06:46:25 -- setup/common.sh@19 -- # local var val
00:03:55.611 06:46:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:55.611 06:46:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:55.611 06:46:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:55.611 06:46:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:55.611 06:46:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:55.611 06:46:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:55.611 06:46:25 -- setup/common.sh@31 -- # IFS=': '
00:03:55.611 06:46:25 -- setup/common.sh@31 -- # read -r var val _
00:03:55.611 06:46:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42609188 kB' 'MemAvailable: 44348040 kB' 'Buffers: 4476 kB' 'Cached: 11419964 kB' 'SwapCached: 8 kB' 'Active: 10500228 kB' 'Inactive: 1585260 kB' 'Active(anon): 9972532 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 664372 kB' 'Mapped: 218768 kB' 'Shmem: 9322508 kB' 'KReclaimable: 267684 kB' 'Slab: 844532 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576848 kB' 'KernelStack: 22000 kB' 'PageTables: 8832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11303376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216036 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[... xtrace condensed: the read/compare/continue cycle repeats for every key from MemTotal through HardwareCorrupted, each failing the [[ $var == AnonHugePages ]] test ...]
00:03:55.612 06:46:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:55.612 06:46:25 -- setup/common.sh@33 -- # echo 0
00:03:55.612 06:46:25 -- setup/common.sh@33 -- # return 0
00:03:55.612 06:46:25 -- setup/hugepages.sh@97 -- # anon=0
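The setup/hugepages.sh@96 test above inspects the transparent-hugepage policy string ("always [madvise] never"); only when THP is not forced off does verify_nr_hugepages bother fetching AnonHugePages for its accounting. Spelled out under those assumptions (the sysfs path is the standard THP knob; the conditional fetch is inferred from the trace):

    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        # THP active: anonymous huge pages could distort the hugetlb numbers.
        anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    fi
    echo "anon_hugepages=$anon"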
00:03:55.612 06:46:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:55.612 06:46:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:55.612 06:46:25 -- setup/common.sh@18 -- # local node=
00:03:55.612 06:46:25 -- setup/common.sh@19 -- # local var val
00:03:55.612 06:46:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:55.612 06:46:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:55.612 06:46:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:55.612 06:46:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:55.612 06:46:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:55.612 06:46:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:55.612 06:46:25 -- setup/common.sh@31 -- # IFS=': '
00:03:55.612 06:46:25 -- setup/common.sh@31 -- # read -r var val _
00:03:55.612 06:46:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42609124 kB' 'MemAvailable: 44347976 kB' 'Buffers: 4476 kB' 'Cached: 11419964 kB' 'SwapCached: 8 kB' 'Active: 10500436 kB' 'Inactive: 1585260 kB' 'Active(anon): 9972740 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 664656 kB' 'Mapped: 218760 kB' 'Shmem: 9322508 kB' 'KReclaimable: 267684 kB' 'Slab: 844592 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576908 kB' 'KernelStack: 22000 kB' 'PageTables: 8844 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11303388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216020 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[... xtrace condensed: the read/compare/continue cycle repeats for keys MemTotal through VmallocUsed, each failing the [[ $var == HugePages_Surp ]] test; the trace is cut off below ...]
00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[
VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.613 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.613 06:46:25 -- setup/common.sh@31 -- # 
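For readability, here is a minimal bash sketch of the lookup pattern the xtrace above is executing (setup/common.sh@31-33): split each meminfo line on ': ', skip non-matching keys with continue, and echo the value of the requested key. This is an illustration of the technique only, not SPDK's actual helper; the function name and the simplified error handling are assumptions.

    # Sketch only: the same parse loop the trace shows, not SPDK's real helper.
    get_meminfo_sketch() {                 # hypothetical name
      local get=$1 mem_f=/proc/meminfo var val _
      while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # non-matching keys fall through
        echo "$val"                        # e.g. "0" for HugePages_Surp
        return 0
      done < "$mem_f"
      return 1                             # key not present
    }
    get_meminfo_sketch HugePages_Surp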
00:03:55.613 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:55.613 06:46:25 -- setup/common.sh@33 -- # echo 0
00:03:55.613 06:46:25 -- setup/common.sh@33 -- # return 0
00:03:55.613 06:46:25 -- setup/hugepages.sh@99 -- # surp=0
00:03:55.613 06:46:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:55.613 06:46:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:55.613 06:46:25 -- setup/common.sh@18 -- # local node=
00:03:55.613 06:46:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:55.613 06:46:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:55.613 06:46:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42621176 kB' 'MemAvailable: 44360028 kB' 'Buffers: 4476 kB' 'Cached: 11419976 kB' 'SwapCached: 8 kB' 'Active: 10498896 kB' 'Inactive: 1585260 kB' 'Active(anon): 9971200 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 663104 kB' 'Mapped: 217544 kB' 'Shmem: 9322520 kB' 'KReclaimable: 267684 kB' 'Slab: 844556 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576872 kB' 'KernelStack: 21968 kB' 'PageTables: 8696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11296096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215988 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[xtrace elided: every key from MemTotal through HugePages_Free is compared against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and hits "continue"]
00:03:55.615 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:55.615 06:46:25 -- setup/common.sh@33 -- # echo 0
00:03:55.615 06:46:25 -- setup/common.sh@33 -- # return 0
00:03:55.615 06:46:25 -- setup/hugepages.sh@100 -- # resv=0
00:03:55.615 06:46:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:55.615 nr_hugepages=1024
00:03:55.615 06:46:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:55.615 resv_hugepages=0
00:03:55.615 06:46:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:55.615 surplus_hugepages=0
00:03:55.615 06:46:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:55.615 anon_hugepages=0
00:03:55.615 06:46:25 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:55.615 06:46:25 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
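The accounting step just traced (hugepages.sh@107) is plain arithmetic: the kernel's HugePages_Total must equal the requested pool plus surplus and reserved pages, here 1024 == 1024 + 0 + 0. A hedged stand-alone sketch of the same check, reading the counters with awk (variable names are assumptions, not SPDK's):

    nr_hugepages=1024   # the target this test configured (taken from the trace)
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/  {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/  {print $2}' /proc/meminfo)
    # The pool is consistent when total == nr_hugepages + surp + resv.
    (( total == nr_hugepages + surp + resv )) && echo "hugepage pool consistent"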
00:03:55.615 06:46:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:55.615 06:46:25 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:55.615 06:46:25 -- setup/common.sh@18 -- # local node=
00:03:55.615 06:46:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:55.615 06:46:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:55.615 06:46:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42623708 kB' 'MemAvailable: 44362560 kB' 'Buffers: 4476 kB' 'Cached: 11419992 kB' 'SwapCached: 8 kB' 'Active: 10498928 kB' 'Inactive: 1585260 kB' 'Active(anon): 9971232 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 663104 kB' 'Mapped: 217544 kB' 'Shmem: 9322536 kB' 'KReclaimable: 267684 kB' 'Slab: 844552 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576868 kB' 'KernelStack: 21968 kB' 'PageTables: 8696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11296112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216004 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[xtrace elided: every key from MemTotal through Unaccepted is compared against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and hits "continue"]
00:03:55.878 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:55.878 06:46:25 -- setup/common.sh@33 -- # echo 1024
00:03:55.878 06:46:25 -- setup/common.sh@33 -- # return 0
00:03:55.878 06:46:25 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:55.878 06:46:25 -- setup/hugepages.sh@112 -- # get_nodes
00:03:55.878 06:46:25 -- setup/hugepages.sh@27 -- # local node
00:03:55.878 06:46:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:55.878 06:46:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:55.878 06:46:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:55.878 06:46:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:55.878 06:46:25 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:55.878 06:46:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:55.878 06:46:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:55.878 06:46:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:55.878 06:46:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:55.878 06:46:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:55.878 06:46:25 -- setup/common.sh@18 -- # local node=0
00:03:55.878 06:46:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:55.878 06:46:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:55.878 06:46:25 -- setup/common.sh@28 -- # mapfile -t mem
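The per-node pass that starts here switches mem_f from /proc/meminfo to /sys/devices/system/node/nodeN/meminfo (common.sh@23-24) and repeats the same key scan for each NUMA node that get_nodes found via the node+([0-9]) glob. A minimal sketch of that enumeration; the awk extraction is an assumption, but the sysfs layout is the one visible in the trace:

    shopt -s extglob nullglob
    for node in /sys/devices/system/node/node+([0-9]); do
      # Per-node meminfo lines look like "Node 0 HugePages_Total:   512",
      # so take the last field rather than a fixed column.
      total=$(awk '/HugePages_Total:/ {print $NF}' "$node/meminfo")
      echo "${node##*/}: HugePages_Total=$total"   # node0: 512, node1: 512
    done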
00:03:55.879 06:46:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26258724 kB' 'MemUsed: 6333360 kB' 'SwapCached: 8 kB' 'Active: 3918636 kB' 'Inactive: 187304 kB' 'Active(anon): 3621468 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956148 kB' 'Mapped: 166400 kB' 'AnonPages: 153084 kB' 'Shmem: 3471676 kB' 'KernelStack: 11864 kB' 'PageTables: 4240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 390364 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 265884 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: every node0 key from MemTotal through HugePages_Free is compared against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and hits "continue"]
00:03:55.879 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:55.879 06:46:25 -- setup/common.sh@33 -- # echo 0
00:03:55.879 06:46:25 -- setup/common.sh@33 -- # return 0
00:03:55.879 06:46:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:55.879 06:46:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:55.879 06:46:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:55.879 06:46:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:55.879 06:46:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:55.879 06:46:25 -- setup/common.sh@18 -- # local node=1
00:03:55.879 06:46:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:55.879 06:46:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:55.879 06:46:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:55.880 06:46:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 16364984 kB' 'MemUsed: 11338144 kB' 'SwapCached: 0 kB' 'Active: 6580328 kB' 'Inactive: 1397956 kB' 'Active(anon): 6349800 kB' 'Inactive(anon): 11016 kB' 'Active(file): 230528 kB' 'Inactive(file): 1386940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7468356 kB' 'Mapped: 51144 kB' 'AnonPages: 510016 kB' 'Shmem: 5850888 kB' 'KernelStack: 10104 kB' 'PageTables: 4456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 143204 kB' 'Slab: 454188 kB' 'SReclaimable: 143204 kB' 'SUnreclaim: 310984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: node1 key scan in progress; the section is truncated mid-scan]
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # continue 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.880 06:46:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.880 06:46:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.880 06:46:25 -- setup/common.sh@33 -- # echo 0 00:03:55.880 06:46:25 -- setup/common.sh@33 -- # return 0 00:03:55.880 06:46:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:55.880 06:46:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:55.880 06:46:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:55.880 06:46:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:55.881 06:46:25 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:55.881 node0=512 expecting 512 00:03:55.881 06:46:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:55.881 06:46:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:55.881 06:46:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:55.881 06:46:25 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:55.881 node1=512 expecting 512 00:03:55.881 06:46:25 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:55.881 00:03:55.881 real 0m3.737s 00:03:55.881 user 0m1.405s 00:03:55.881 sys 0m2.394s 00:03:55.881 06:46:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:55.881 06:46:25 -- common/autotest_common.sh@10 -- # set +x 00:03:55.881 ************************************ 00:03:55.881 END TEST per_node_1G_alloc 00:03:55.881 ************************************ 00:03:55.881 06:46:25 -- 
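The per_node_1G_alloc trace above reduces to one pattern: get_meminfo is pointed at /sys/devices/system/node/nodeN/meminfo for each node, the "Node N " prefix is stripped, and the requested field (here HugePages_Surp, and earlier HugePages_Total) is pulled out so hugepages.sh can fold it into nodes_test[] and compare each node against the expected 512 pages. A minimal standalone sketch of that per-node check, assuming a two-node Linux host; node_hugepages is an illustrative helper name, not part of setup/common.sh:

    #!/usr/bin/env bash
    # Minimal sketch: read a field from each NUMA node's meminfo and compare
    # it against an expected even split, as the trace above does for node0/node1.
    node_hugepages() {
        local node=$1 var val _
        # Per-node meminfo prefixes every line with "Node N "; strip it first.
        while IFS=': ' read -r var val _; do
            [[ $var == HugePages_Total ]] && echo "$val" && return 0
        done < <(sed 's/^Node [0-9]* //' "/sys/devices/system/node/node$node/meminfo")
        return 1
    }

    expected=512
    for node in 0 1; do
        got=$(node_hugepages "$node")
        echo "node$node=$got expecting $expected"
    done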
00:03:55.881 06:46:25 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:55.881 06:46:25 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:55.881 06:46:25 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:55.881 06:46:25 -- common/autotest_common.sh@10 -- # set +x
00:03:55.881 ************************************
00:03:55.881 START TEST even_2G_alloc
00:03:55.881 ************************************
00:03:55.881 06:46:25 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:03:55.881 06:46:25 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:55.881 06:46:25 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:55.881 06:46:25 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:55.881 06:46:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:55.881 06:46:25 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:55.881 06:46:25 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:55.881 06:46:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:55.881 06:46:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:55.881 06:46:25 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:55.881 06:46:25 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:55.881 06:46:25 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:55.881 06:46:25 -- setup/hugepages.sh@83 -- # : 512
00:03:55.881 06:46:25 -- setup/hugepages.sh@84 -- # : 1
00:03:55.881 06:46:25 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:55.881 06:46:25 -- setup/hugepages.sh@83 -- # : 0
00:03:55.881 06:46:25 -- setup/hugepages.sh@84 -- # : 0
00:03:55.881 06:46:25 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:55.881 06:46:25 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:55.881 06:46:25 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:55.881 06:46:25 -- setup/hugepages.sh@153 -- # setup output
00:03:55.881 06:46:25 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:55.881 06:46:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:59.175 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:59.175 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
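In the get_test_nr_hugepages trace above, the argument 2097152 is consistent with a size in kB (2 GB): divided by the default 2048 kB huge page it yields nr_hugepages=1024, and get_test_nr_hugepages_per_node then spreads that count evenly over the two nodes, which is why nodes_test ends up as 512/512. A rough sketch of that arithmetic under those assumptions; the variable names mirror the trace, but this is a simplification, not the full hugepages.sh logic:

    #!/usr/bin/env bash
    # Rough sketch of size -> nr_hugepages -> even per-node split, matching
    # "get_test_nr_hugepages 2097152" producing 1024 pages, 512 per node.
    # Assumes the size is in kB and a 2048 kB (2 MB) default huge page.
    size=2097152                                 # requested kB (2 GB)
    default_hugepages=2048                       # huge page size in kB (assumed)
    nr_hugepages=$((size / default_hugepages))   # -> 1024

    _nr_hugepages=$nr_hugepages
    _no_nodes=2
    declare -a nodes_test
    # Fill the array from the last node down, giving each node an even share.
    while ((_no_nodes > 0)); do
        nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes))
        _nr_hugepages=$((_nr_hugepages - nodes_test[_no_nodes - 1]))
        _no_nodes=$((_no_nodes - 1))
    done
    echo "${nodes_test[@]}"                      # -> 512 512

Splitting from the last index down keeps the arithmetic exact even when the total does not divide evenly, since each step divides the remainder by the nodes still unassigned.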
00:03:59.439 06:46:29 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:59.439 06:46:29 -- setup/hugepages.sh@89 -- # local node
00:03:59.439 06:46:29 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:59.439 06:46:29 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:59.439 06:46:29 -- setup/hugepages.sh@92 -- # local surp
00:03:59.439 06:46:29 -- setup/hugepages.sh@93 -- # local resv
00:03:59.439 06:46:29 -- setup/hugepages.sh@94 -- # local anon
00:03:59.439 06:46:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:59.439 06:46:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:59.439 06:46:29 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:59.439 06:46:29 -- setup/common.sh@18 -- # local node=
00:03:59.439 06:46:29 -- setup/common.sh@19 -- # local var val
00:03:59.439 06:46:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.439 06:46:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.439 06:46:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.439 06:46:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.439 06:46:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.439 06:46:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.439 06:46:29 -- setup/common.sh@31 -- # IFS=': '
00:03:59.439 06:46:29 -- setup/common.sh@31 -- # read -r var val _
00:03:59.439 06:46:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42697916 kB' 'MemAvailable: 44436768 kB' 'Buffers: 4476 kB' 'Cached: 11420092 kB' 'SwapCached: 8 kB' 'Active: 10504540 kB' 'Inactive: 1585260 kB' 'Active(anon): 9976844 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 668608 kB' 'Mapped: 218060 kB' 'Shmem: 9322636 kB' 'KReclaimable: 267684 kB' 'Slab: 844736 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577052 kB' 'KernelStack: 21984 kB' 'PageTables: 8748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11302544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215928 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[xtrace condensed: setup/common.sh@31-32 walks every /proc/meminfo field from MemTotal through HardwareCorrupted, one "continue" per non-matching key]
00:03:59.440 06:46:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:59.440 06:46:29 -- setup/common.sh@33 -- # echo 0
00:03:59.440 06:46:29 -- setup/common.sh@33 -- # return 0
00:03:59.440 06:46:29 -- setup/hugepages.sh@97 -- # anon=0
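Every get_meminfo call in this log produces the same trace shape: pick /proc/meminfo (or the per-node file when a node argument is given), mapfile it, then read "key: value" pairs with IFS=': ' until the requested key matches, one "[[ ... ]] / continue" pair per field. A condensed sketch of that loop, simplified from the setup/common.sh trace above ("Node N " prefix stripping and error handling are omitted here):

    #!/usr/bin/env bash
    # Condensed sketch of the get_meminfo pattern traced above: scan meminfo
    # key/value pairs until the requested field matches, then print its value.
    get_meminfo() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"   # e.g. 0 for AnonHugePages on this host
                return 0
            fi
            continue          # each skipped key is one "continue" in the xtrace
        done < "$mem_f"
        return 1
    }

    get_meminfo AnonHugePages   # prints 0, matching anon=0 above

The linear scan is why the raw xtrace is so verbose: with set -x enabled, every one of the roughly sixty meminfo fields generates a comparison line and a continue line per query.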
00:03:59.440 06:46:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:59.440 06:46:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:59.440 06:46:29 -- setup/common.sh@18 -- # local node=
00:03:59.440 06:46:29 -- setup/common.sh@19 -- # local var val
00:03:59.440 06:46:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.440 06:46:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.440 06:46:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.440 06:46:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.440 06:46:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.440 06:46:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.440 06:46:29 -- setup/common.sh@31 -- # IFS=': '
00:03:59.440 06:46:29 -- setup/common.sh@31 -- # read -r var val _
00:03:59.440 06:46:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42700448 kB' 'MemAvailable: 44439300 kB' 'Buffers: 4476 kB' 'Cached: 11420092 kB' 'SwapCached: 8 kB' 'Active: 10499376 kB' 'Inactive: 1585260 kB' 'Active(anon): 9971680 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 663460 kB' 'Mapped: 217556 kB' 'Shmem: 9322636 kB' 'KReclaimable: 267684 kB' 'Slab: 844712 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577028 kB' 'KernelStack: 21952 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11296436 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215892 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[xtrace condensed: setup/common.sh@31-32 walks every /proc/meminfo field from MemTotal through HugePages_Free, one "continue" per non-matching key]
00:03:59.441 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:59.442 06:46:29 -- setup/common.sh@33 -- # echo 0
00:03:59.442 06:46:29 -- setup/common.sh@33 -- # return 0
00:03:59.442 06:46:29 -- setup/hugepages.sh@99 -- # surp=0
00:03:59.442 06:46:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:59.442 06:46:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:59.442 06:46:29 -- setup/common.sh@18 -- # local node=
00:03:59.442 06:46:29 -- setup/common.sh@19 -- # local var val
00:03:59.442 06:46:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.442 06:46:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.442 06:46:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.442 06:46:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.442 06:46:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.442 06:46:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.442 06:46:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42702040 kB' 'MemAvailable: 44440892 kB' 'Buffers: 4476 kB' 'Cached: 11420104 kB' 'SwapCached: 8 kB' 'Active: 10499900 kB' 'Inactive: 1585260 kB' 'Active(anon): 9972204 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 664052 kB' 'Mapped: 217556 kB' 'Shmem: 9322648 kB' 'KReclaimable: 267684 kB' 'Slab: 844728 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577044 kB' 'KernelStack: 21984 kB' 'PageTables: 8752 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11299236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215860 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
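With anon and surp already resolved to 0, the scan below pulls HugePages_Rsvd the same way, and verify_nr_hugepages then checks that the observed HugePages_Total equals nr_hugepages + surp + resv before comparing per-node counts. A small sketch of that bookkeeping, with the values hard-coded from this run:

    #!/usr/bin/env bash
    # Sketch of the consistency check verify_nr_hugepages performs below,
    # with the values observed in this run plugged in.
    nr_hugepages=1024   # requested by get_test_nr_hugepages
    anon=0              # AnonHugePages
    surp=0              # HugePages_Surp
    resv=0              # HugePages_Rsvd
    total=1024          # HugePages_Total from /proc/meminfo

    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    echo "anon_hugepages=$anon"
    if ((total == nr_hugepages + surp + resv)); then
        echo "hugepage accounting is consistent"
    fi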
[xtrace condensed: setup/common.sh@31-32 walks every /proc/meminfo field from MemTotal through HugePages_Free, one "continue" per non-matching key]
00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:59.443 06:46:29 -- setup/common.sh@33 -- # echo 0
00:03:59.443 06:46:29 -- setup/common.sh@33 -- # return 0
00:03:59.443 06:46:29 -- setup/hugepages.sh@100 -- # resv=0
00:03:59.443 06:46:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:59.443 nr_hugepages=1024
00:03:59.443 06:46:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:59.443 resv_hugepages=0
00:03:59.443 06:46:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:59.443 surplus_hugepages=0
00:03:59.443 06:46:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:59.443 anon_hugepages=0
00:03:59.443 06:46:29 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:59.443 06:46:29 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:59.443 06:46:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:59.443 06:46:29 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:59.443 06:46:29 -- setup/common.sh@18 -- # local node=
00:03:59.443 06:46:29 -- setup/common.sh@19 -- # local var val
00:03:59.443 06:46:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.443 06:46:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.443 06:46:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.443 06:46:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.443 06:46:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.443 06:46:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': '
00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _
00:03:59.443 06:46:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42701536 kB' 'MemAvailable: 44440388 kB' 'Buffers: 4476 kB' 'Cached: 11420116 kB' 'SwapCached: 8 kB' 'Active: 10499704 kB' 'Inactive: 1585260 kB' 'Active(anon): 9972008 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 663756 kB' 'Mapped: 217560 kB' 'Shmem: 9322660 kB' 'KReclaimable: 267684 kB' 'Slab: 844728
kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577044 kB' 'KernelStack: 21936 kB' 'PageTables: 8628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11299620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215892 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.443 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.443 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 
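The xtrace above is setup/common.sh's get_meminfo scanning a meminfo dump one field at a time: IFS=': ' plus read -r var val _ splits each "Key: value kB" line, every non-matching key hits the traced continue, and the first matching key echoes its value and returns 0. A minimal sketch of that loop, reconstructed from the trace rather than copied from the script (the not-found return code is an assumption):

    # Sketch of the get_meminfo field scan traced above (reconstructed;
    # the return-1-when-missing branch is an assumption).
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # every "[[ Foo == \H\u\g\e\P\a\g\e\s... ]] ... continue" line
            # in the trace is one non-matching key being skipped here
            [[ $var == "$get" ]] || continue
            echo "$val"            # e.g. 0 for HugePages_Rsvd, 1024 for HugePages_Total
            return 0
        done < /proc/meminfo
        return 1
    }
    get_meminfo HugePages_Rsvd     # prints 0 on the system traced here

The echoed values feed the bookkeeping seen above (resv=0, nr_hugepages=1024, surplus_hugepages=0, anon_hugepages=0) and the consistency check at hugepages.sh@107-@110, (( 1024 == nr_hugepages + surp + resv )).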
00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 
-- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # 
[[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.444 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.444 06:46:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.445 06:46:29 -- setup/common.sh@33 -- # echo 1024 00:03:59.445 06:46:29 -- setup/common.sh@33 -- # return 0 00:03:59.445 06:46:29 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:59.445 06:46:29 -- setup/hugepages.sh@112 -- # get_nodes 00:03:59.445 06:46:29 -- setup/hugepages.sh@27 -- # local node 00:03:59.445 06:46:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.445 06:46:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:59.445 06:46:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.445 06:46:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:59.445 06:46:29 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:59.445 06:46:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:59.445 06:46:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:59.445 06:46:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:59.445 06:46:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:59.445 06:46:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.445 06:46:29 -- setup/common.sh@18 -- # local node=0 00:03:59.445 06:46:29 -- setup/common.sh@19 -- # local var val 00:03:59.445 06:46:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:59.445 06:46:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.445 06:46:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:59.445 06:46:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:59.445 06:46:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.445 06:46:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26295332 kB' 'MemUsed: 6296752 kB' 'SwapCached: 8 kB' 'Active: 3919688 kB' 'Inactive: 187304 kB' 'Active(anon): 3622520 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956196 kB' 'Mapped: 166412 kB' 'AnonPages: 154104 kB' 'Shmem: 3471724 kB' 'KernelStack: 11912 kB' 'PageTables: 4044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 390376 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 265896 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- 
setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 
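When get_meminfo is given a node number, the trace at setup/common.sh@22-@29 shows it swapping mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo and stripping the per-line "Node 0 " prefix with the extglob substitution mem=("${mem[@]#Node +([0-9]) }") before the same field scan runs. A standalone sketch of that source selection (the three-field preview at the end is illustrative only):

    # Per-node meminfo selection, mirroring setup/common.sh@22-@29.
    shopt -s extglob                 # needed for the +([0-9]) pattern below
    node=0                           # an empty node selects the global view
    mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }") # "Node 0 MemTotal: ..." -> "MemTotal: ..."
    printf '%s\n' "${mem[@]:0:3}"    # e.g. MemTotal / MemFree / MemUsed lines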
00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.445 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.445 06:46:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.445 06:46:29 -- 
setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@33 -- # echo 0 00:03:59.446 06:46:29 -- setup/common.sh@33 -- # return 0 00:03:59.446 06:46:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:59.446 06:46:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:59.446 06:46:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:59.446 06:46:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:59.446 06:46:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.446 06:46:29 -- setup/common.sh@18 -- # local node=1 00:03:59.446 06:46:29 -- setup/common.sh@19 -- # local var val 00:03:59.446 06:46:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:59.446 06:46:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.446 06:46:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:59.446 06:46:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:59.446 06:46:29 -- setup/common.sh@28 -- # 
mapfile -t mem 00:03:59.446 06:46:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 16405524 kB' 'MemUsed: 11297604 kB' 'SwapCached: 0 kB' 'Active: 6580512 kB' 'Inactive: 1397956 kB' 'Active(anon): 6349984 kB' 'Inactive(anon): 11016 kB' 'Active(file): 230528 kB' 'Inactive(file): 1386940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7468436 kB' 'Mapped: 51148 kB' 'AnonPages: 510072 kB' 'Shmem: 5850968 kB' 'KernelStack: 10104 kB' 'PageTables: 4452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 143204 kB' 'Slab: 454352 kB' 'SReclaimable: 143204 kB' 'SUnreclaim: 311148 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 
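The two per-node dumps are internally consistent: MemUsed is simply MemTotal minus MemFree, so node0 gives 32592084 - 26295332 = 6296752 kB and node1 gives 27703128 - 16405524 = 11297604 kB, both matching the printf output in the trace. The same check in shell arithmetic, with the values copied from the dumps:

    echo $(( 32592084 - 26295332 ))   # 6296752  (node0 MemUsed, kB)
    echo $(( 27703128 - 16405524 ))   # 11297604 (node1 MemUsed, kB)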
00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.446 06:46:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.446 06:46:29 -- 
setup/common.sh@32 -- # continue 00:03:59.446 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 
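Around these two scans, hugepages.sh@115-@117 folds the reserved count and then each node's surplus into the expected per-node totals; with resv=0 and HugePages_Surp reading 0 on both nodes, each entry stays at the 512 pages read earlier from sysfs, which the node0=512 / node1=512 assertions just below confirm. A sketch of that accumulation (the surplus lookup is stubbed, and the "expecting" value, taken from nodes_sys in the script, is collapsed here since both sides are 512):

    # Reconstruction of the hugepages.sh@115-@117 accumulation.
    nodes_test=(512 512)   # per-node totals read earlier from sysfs
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        surp=0             # stand-in for: get_meminfo HugePages_Surp "$node"
        (( nodes_test[node] += surp ))
    done
    for node in "${!nodes_test[@]}"; do
        echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
    done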
00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # continue 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.447 06:46:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.447 06:46:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.447 06:46:29 -- setup/common.sh@33 -- # echo 0 00:03:59.447 06:46:29 -- setup/common.sh@33 -- # return 0 00:03:59.447 06:46:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:59.447 06:46:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:59.447 06:46:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:59.447 06:46:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:59.447 06:46:29 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:59.447 node0=512 expecting 512 00:03:59.447 06:46:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:59.447 06:46:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:59.447 06:46:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:59.447 06:46:29 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:59.447 node1=512 expecting 512 00:03:59.447 06:46:29 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:59.447 00:03:59.447 real 0m3.676s 00:03:59.447 user 0m1.398s 00:03:59.447 sys 0m2.334s 00:03:59.447 06:46:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:59.447 06:46:29 -- common/autotest_common.sh@10 -- # set +x 00:03:59.447 ************************************ 00:03:59.447 END TEST even_2G_alloc 00:03:59.447 ************************************ 00:03:59.705 06:46:29 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:59.705 06:46:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:59.705 06:46:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:59.705 06:46:29 -- common/autotest_common.sh@10 -- # set +x 00:03:59.705 ************************************ 00:03:59.706 START TEST odd_alloc 00:03:59.706 ************************************ 00:03:59.706 06:46:29 -- common/autotest_common.sh@1104 -- # odd_alloc 00:03:59.706 06:46:29 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:59.706 06:46:29 -- setup/hugepages.sh@49 -- # local size=2098176 00:03:59.706 06:46:29 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:59.706 06:46:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:59.706 06:46:29 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:59.706 06:46:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:59.706 06:46:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:59.706 06:46:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:59.706 06:46:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:59.706 06:46:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:59.706 06:46:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@74 -- # (( 0 > 
0 )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:59.706 06:46:29 -- setup/hugepages.sh@83 -- # : 513 00:03:59.706 06:46:29 -- setup/hugepages.sh@84 -- # : 1 00:03:59.706 06:46:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:59.706 06:46:29 -- setup/hugepages.sh@83 -- # : 0 00:03:59.706 06:46:29 -- setup/hugepages.sh@84 -- # : 0 00:03:59.706 06:46:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:59.706 06:46:29 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:59.706 06:46:29 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:59.706 06:46:29 -- setup/hugepages.sh@160 -- # setup output 00:03:59.706 06:46:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:59.706 06:46:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:03.095 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:03.095 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:03.095 06:46:32 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:03.095 06:46:32 -- setup/hugepages.sh@89 -- # local node 00:04:03.095 06:46:32 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:03.095 06:46:32 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:03.095 06:46:32 -- setup/hugepages.sh@92 -- # local surp 00:04:03.095 06:46:32 -- setup/hugepages.sh@93 -- # local resv 00:04:03.095 06:46:32 -- setup/hugepages.sh@94 -- # local anon 00:04:03.095 06:46:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:03.095 06:46:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:03.095 06:46:32 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:03.095 06:46:32 -- setup/common.sh@18 -- # local node= 00:04:03.095 06:46:32 -- setup/common.sh@19 -- # local var val 00:04:03.095 06:46:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:03.095 06:46:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.095 06:46:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.095 06:46:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.095 06:46:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.095 06:46:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.095 06:46:32 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:03.095 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42733968 kB' 'MemAvailable: 44472820 kB' 'Buffers: 4476 kB' 'Cached: 11420224 kB' 'SwapCached: 8 kB' 'Active: 10499156 kB' 'Inactive: 1585260 kB' 'Active(anon): 9971460 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662492 kB' 'Mapped: 217688 kB' 'Shmem: 9322768 kB' 'KReclaimable: 267684 kB' 'Slab: 845764 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 578080 kB' 'KernelStack: 22032 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 11301400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216180 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.096 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.096 
[setup/common.sh@31-32: field-by-field scan of /proc/meminfo continues; Mapped through HardwareCorrupted are each read and skipped]
00:04:03.096 06:46:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:03.096 06:46:32 -- setup/common.sh@33 -- # echo 0
00:04:03.096 06:46:32 -- setup/common.sh@33 -- # return 0
00:04:03.097 06:46:32 -- setup/hugepages.sh@97 -- # anon=0
00:04:03.097 06:46:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:03.097 06:46:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.097 06:46:32 -- setup/common.sh@18 -- # local node=
00:04:03.097 06:46:32 -- setup/common.sh@19 -- # local var val
00:04:03.097 06:46:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.097 06:46:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.097 06:46:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.097 06:46:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.097 06:46:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.097 06:46:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.097 06:46:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42735384 kB' 'MemAvailable: 44474236 kB' 'Buffers: 4476 kB' 'Cached: 11420228 kB' 'SwapCached: 8 kB' 'Active: 10499020 kB' 'Inactive: 1585260 kB' 'Active(anon): 9971324 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662468 kB' 'Mapped: 217640 kB' 'Shmem: 9322772 kB' 'KReclaimable: 267684 kB' 'Slab: 845796 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 578112 kB' 'KernelStack: 22160 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 11301164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216164 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[setup/common.sh@31-32: the snapshot is then scanned field by field; MemTotal through HugePages_Rsvd are each read and skipped]
00:04:03.098 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.098 06:46:32 -- setup/common.sh@33 -- # echo 0
00:04:03.098 06:46:32 -- setup/common.sh@33 -- # return 0
00:04:03.098 06:46:32 -- setup/hugepages.sh@99 -- # surp=0
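[editor's note] The loop traced above is the test's get_meminfo helper from setup/common.sh: it snapshots a meminfo file into an array, strips any per-node line prefix, then linearly scans for the requested field and echoes its value. Below is a minimal sketch reconstructed from this trace alone; it is not SPDK's verbatim helper, and the elif fallback for a missing node is an assumption.

shopt -s extglob  # the +([0-9]) pattern below is an extglob pattern

get_meminfo() {
	local get=$1
	local node=$2
	local var val
	local mem_f mem

	mem_f=/proc/meminfo
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		# A valid node was requested: use the per-NUMA-node counters instead.
		mem_f=/sys/devices/system/node/node$node/meminfo
	elif [[ -n $node ]]; then
		return 1  # assumption: bail out when the requested node does not exist
	fi

	mapfile -t mem < "$mem_f"
	# Per-node meminfo prefixes every line with "Node N "; strip that prefix.
	mem=("${mem[@]#Node +([0-9]) }")

	local line
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		# First field name that matches wins; print its value (kB or pages).
		[[ $var == "$get" ]] && echo "$val" && return 0
	done
	return 1
}

Called as "get_meminfo HugePages_Surp" it reads the system-wide /proc/meminfo, as in the trace above; called as "get_meminfo HugePages_Surp 0" it switches to node0's sysfs view, which is exactly the path change visible at common.sh@24 further down.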
00:04:03.098 06:46:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:03.098 06:46:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:03.098 06:46:32 -- setup/common.sh@18 -- # local node=
00:04:03.098 06:46:32 -- setup/common.sh@19 -- # local var val
00:04:03.098 06:46:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.098 06:46:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.098 06:46:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.098 06:46:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.098 06:46:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.098 06:46:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.098 06:46:32 -- setup/common.sh@31 -- # IFS=': '
00:04:03.098 06:46:32 -- setup/common.sh@31 -- # read -r var val _
00:04:03.098 06:46:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42735164 kB' 'MemAvailable: 44474016 kB' 'Buffers: 4476 kB' 'Cached: 11420240 kB' 'SwapCached: 8 kB' 'Active: 10498440 kB' 'Inactive: 1585260 kB' 'Active(anon): 9970744 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 661772 kB' 'Mapped: 217564 kB' 'Shmem: 9322784 kB' 'KReclaimable: 267684 kB' 'Slab: 845768 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 578084 kB' 'KernelStack: 22144 kB' 'PageTables: 9012 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 11301428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216116 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[setup/common.sh@31-32: the snapshot is then scanned field by field; MemTotal through HugePages_Free are each read and skipped]
00:04:03.099 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:03.099 06:46:32 -- setup/common.sh@33 -- # echo 0
00:04:03.099 06:46:32 -- setup/common.sh@33 -- # return 0
00:04:03.099 06:46:32 -- setup/hugepages.sh@100 -- # resv=0
00:04:03.099 06:46:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
nr_hugepages=1025
00:04:03.099 06:46:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:03.099 06:46:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:03.099 06:46:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:03.099 06:46:32 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:03.099 06:46:32 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:03.099 06:46:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:03.099 06:46:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:03.099 06:46:32 -- setup/common.sh@18 -- # local node=
00:04:03.099 06:46:32 -- setup/common.sh@19 -- # local var val
00:04:03.099 06:46:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.099 06:46:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.099 06:46:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.099 06:46:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.099 06:46:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.100 06:46:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.100 06:46:32 -- setup/common.sh@31 -- # IFS=': '
00:04:03.100 06:46:32 -- setup/common.sh@31 -- # read -r var val _
00:04:03.100 06:46:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42734248 kB' 'MemAvailable: 44473100 kB' 'Buffers: 4476 kB' 'Cached: 11420240 kB' 'SwapCached: 8 kB' 'Active: 10498496 kB' 'Inactive: 1585260 kB' 'Active(anon): 9970800 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662320 kB' 'Mapped: 217564 kB' 'Shmem: 9322784 kB' 'KReclaimable: 267684 kB' 'Slab: 845768 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 578084 kB' 'KernelStack: 22080 kB' 'PageTables: 8988 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 11301440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216164 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
[setup/common.sh@31-32: the snapshot is then scanned field by field; MemTotal through Unaccepted are each read and skipped]
00:04:03.101 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:03.101 06:46:32 -- setup/common.sh@33 -- # echo 1025
00:04:03.101 06:46:32 -- setup/common.sh@33 -- # return 0
00:04:03.101 06:46:32 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:03.101 06:46:32 -- setup/hugepages.sh@112 -- # get_nodes
00:04:03.101 06:46:32 -- setup/hugepages.sh@27 -- # local node
00:04:03.101 06:46:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.101 06:46:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:03.101 06:46:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.101 06:46:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:03.101 06:46:32 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:03.101 06:46:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
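[editor's note] At this point the test holds all four counters it needs, and hugepages.sh@107-117 is in effect checking the bookkeeping sketched below. The variable names match the trace, but the standalone script is only an illustrative distillation, not SPDK's setup/hugepages.sh; the odd page count, 1025, is deliberately awkward because it cannot split evenly across the two NUMA nodes.

#!/usr/bin/env bash
# Sketch of the hugepage accounting verified above (assumption: reconstructed
# from the trace, not copied from SPDK's setup/hugepages.sh).
nr_hugepages=1025 # requested pool size
anon=0            # AnonHugePages (transparent hugepages) in use
surp=0            # HugePages_Surp: surplus pages beyond the configured pool
resv=0            # HugePages_Rsvd: reserved but not yet faulted in
total=1025        # HugePages_Total reported by /proc/meminfo

# The pool must account for every page: total == requested + surplus + reserved.
(( total == nr_hugepages + surp + resv )) || { echo "pool accounting mismatch"; exit 1; }

# Per-node sysfs counters must add back up to the system-wide total; an odd
# count cannot split evenly, hence 512 on node0 and 513 on node1 in this run.
nodes_sys=(512 513)
(( nodes_sys[0] + nodes_sys[1] == total )) && echo "nodes account for all $total pages"

The loop that follows (@115-117) then reads HugePages_Surp for each node and folds it, together with resv, into nodes_test[] so the per-node counts can be compared against that 512/513 expectation.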
00:04:03.101 06:46:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.101 06:46:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.101 06:46:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:03.101 06:46:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.101 06:46:32 -- setup/common.sh@18 -- # local node=0
00:04:03.101 06:46:32 -- setup/common.sh@19 -- # local var val
00:04:03.101 06:46:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.101 06:46:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.101 06:46:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:03.101 06:46:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:03.101 06:46:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.101 06:46:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.101 06:46:32 -- setup/common.sh@31 -- # IFS=': '
00:04:03.101 06:46:32 -- setup/common.sh@31 -- # read -r var val _
00:04:03.101 06:46:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26304584 kB' 'MemUsed: 6287500 kB' 'SwapCached: 8 kB' 'Active: 3919612 kB' 'Inactive: 187304 kB' 'Active(anon): 3622444 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956236 kB' 'Mapped: 166416 kB' 'AnonPages: 153784 kB' 'Shmem: 3471764 kB' 'KernelStack: 11848 kB' 'PageTables: 4008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 391348 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 266868 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32: the node0 snapshot is scanned field by field; MemTotal through HugePages_Free are each read and skipped]
00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.102 06:46:32 -- setup/common.sh@33 -- # echo 0
00:04:03.102 06:46:32 -- setup/common.sh@33 -- # return 0
00:04:03.102 06:46:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:03.102 06:46:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.102 06:46:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.102 06:46:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:03.102 06:46:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.102 06:46:32 -- setup/common.sh@18 -- # local node=1
00:04:03.102 06:46:32 -- setup/common.sh@19 -- # local var val
00:04:03.102 06:46:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.102 06:46:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.102 06:46:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:03.102 06:46:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:03.102 06:46:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.102 06:46:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': '
00:04:03.102 06:46:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 16430212 kB' 'MemUsed: 11272916 kB' 'SwapCached: 0 kB' 'Active: 6578140 kB' 'Inactive: 1397956 kB' 'Active(anon): 6347612 kB' 'Inactive(anon): 11016 kB' 'Active(file): 230528 kB' 'Inactive(file): 1386940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7468528 kB' 'Mapped: 51144 kB' 'AnonPages: 507744 kB' 'Shmem: 5851060 kB' 'KernelStack: 10136 kB' 'PageTables: 4504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 143204 kB' 'Slab: 454420 kB' 'SReclaimable: 143204 kB' 'SUnreclaim: 311216 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _
00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue
00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': '
00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _
00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.102 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.102 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 
00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # continue 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.103 06:46:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.103 06:46:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.103 06:46:32 -- setup/common.sh@33 -- # echo 0 00:04:03.103 06:46:32 -- setup/common.sh@33 -- # return 0 00:04:03.103 06:46:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:03.103 06:46:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:03.103 06:46:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:03.103 node0=512 expecting 513 00:04:03.103 06:46:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # 
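Both per-node reads above funnel through the same helper: get_meminfo picks /proc/meminfo or the per-node sysfs file, strips the "Node N " prefix, and splits each line on ': ' until the requested field matches. A minimal standalone sketch of that pattern, reconstructed from the traced commands rather than copied from setup/common.sh:

    #!/usr/bin/env bash
    # Prefix-stripping with "Node +([0-9]) " needs extglob, as in the
    # traced mem=(...) expansion.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=$2
        local var val _ line
        local mem_f=/proc/meminfo
        local -a mem
        # Use the per-node view when a node id is given and sysfs has it.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that.
        mem=("${mem[@]#Node +([0-9]) }")
        local IFS=': '
        for line in "${mem[@]}"; do
            read -r var val _ <<< "$line"
            # Emit the value (kB or page count) once the field matches.
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }

    # The two reads traced above, on this box:
    # get_meminfo HugePages_Surp 0   -> 0
    # get_meminfo HugePages_Total 1  -> 513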
00:04:03.103 06:46:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.103 06:46:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
node0=512 expecting 513
00:04:03.103 06:46:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.103 06:46:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.103 06:46:32 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
node1=513 expecting 512
00:04:03.103 06:46:32 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:03.103 real 0m3.528s
00:04:03.103 user 0m1.316s
00:04:03.103 sys 0m2.255s
00:04:03.103 06:46:32 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:03.103 06:46:32 -- common/autotest_common.sh@10 -- # set +x
00:04:03.103 ************************************
00:04:03.103 END TEST odd_alloc
00:04:03.103 ************************************
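The pass at hugepages.sh@130 rests on a small bash idiom: the script uses each observed count as the index of a plain indexed array (the sorted_t/sorted_s assignments above), because "${!arr[*]}" lists indexed-array keys in ascending order, which gives a free sort. A sketch under this run's values:

    #!/usr/bin/env bash
    # What each node actually got in odd_alloc (node0=512, node1=513 here).
    nodes_test=([0]=512 [1]=513)
    sorted_t=()
    for node in "${!nodes_test[@]}"; do
        # Value-as-index: sorted_t[512]=1 and sorted_t[513]=1, so
        # "${!sorted_t[*]}" expands to "512 513", already sorted.
        sorted_t[nodes_test[node]]=1
    done
    # The comparison shape of the traced [[ 512 513 == 512 513 ]] check:
    [[ ${!sorted_t[*]} == "512 513" ]] && echo "odd_alloc layout OK"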
00:04:03.103 06:46:32 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:03.103 06:46:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:03.103 06:46:32 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:03.103 06:46:32 -- common/autotest_common.sh@10 -- # set +x
00:04:03.103 ************************************
00:04:03.103 START TEST custom_alloc
00:04:03.103 ************************************
00:04:03.103 06:46:32 -- common/autotest_common.sh@1104 -- # custom_alloc
00:04:03.103 06:46:32 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:03.103 06:46:32 -- setup/hugepages.sh@169 -- # local node
00:04:03.103 06:46:32 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:03.103 06:46:32 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:03.103 06:46:32 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:03.103 06:46:32 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:03.103 06:46:32 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:03.103 06:46:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:03.103 06:46:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:03.103 06:46:32 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:03.103 06:46:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node ... (no user nodes; 512 pages split over _no_nodes=2 -> nodes_test[0]=256, nodes_test[1]=256)
00:04:03.104 06:46:32 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:04:03.104 06:46:32 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:03.104 06:46:32 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:03.104 06:46:32 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:03.104 06:46:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:03.104 06:46:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:03.104 06:46:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:03.104 06:46:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node ... (this pass seeds from nodes_hp: nodes_test[0]=512)
00:04:03.104 06:46:32 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:03.104 06:46:32 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:03.104 06:46:32 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:03.104 06:46:32 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:03.104 06:46:32 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:03.104 06:46:32 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:03.104 06:46:32 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:03.104 06:46:32 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node ... (final split: nodes_test[0]=512, nodes_test[1]=1024)
00:04:03.104 06:46:32 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
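What the two get_test_nr_hugepages calls just computed, in isolation: the requested pool size in kB divided by the default hugepage size (2048 kB per the Hugepagesize line in the snapshots below) gives the page count, and HUGENODE is the comma-joined per-node assignment. A sketch with this run's numbers:

    #!/usr/bin/env bash
    default_hugepages=2048              # kB, Hugepagesize on this box
    for size in 1048576 2097152; do     # the two requested pools, in kB
        echo "size=${size}kB -> nr_hugepages=$(( size / default_hugepages ))"
    done                                # -> 512 and 1024, as traced

    # HUGENODE assembly, the same loop shape as the trace above:
    nodes_hp=([0]=512 [1]=1024)
    HUGENODE=()
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    IFS=,                               # custom_alloc sets local IFS=,
    echo "HUGENODE=${HUGENODE[*]}"      # nodes_hp[0]=512,nodes_hp[1]=1024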
00:04:03.104 06:46:32 -- setup/hugepages.sh@187 -- # setup output
00:04:03.104 06:46:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:03.104 06:46:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:07.304 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:07.304 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:07.304 06:46:36 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:04:07.304 06:46:36 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:04:07.304 06:46:36 -- setup/hugepages.sh@89 -- # local node
00:04:07.304 06:46:36 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:07.304 06:46:36 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:07.304 06:46:36 -- setup/hugepages.sh@92 -- # local surp
00:04:07.304 06:46:36 -- setup/hugepages.sh@93 -- # local resv
00:04:07.304 06:46:36 -- setup/hugepages.sh@94 -- # local anon
00:04:07.304 06:46:36 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:07.304 06:46:36 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:07.304 06:46:36 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:07.304 06:46:36 -- setup/common.sh@18 -- # local node=
00:04:07.304 06:46:36 -- setup/common.sh@19 -- # local var val
00:04:07.304 06:46:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.304 06:46:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.304 06:46:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.304 06:46:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.304 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.304 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.304 06:46:36 -- setup/common.sh@31 -- # IFS=': '
00:04:07.304 06:46:36 -- setup/common.sh@31 -- # read -r var val _
00:04:07.304 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41730968 kB' 'MemAvailable: 43469820 kB' 'Buffers: 4476 kB' 'Cached: 11420372 kB' 'SwapCached: 8 kB' 'Active: 10498688 kB' 'Inactive: 1585260 kB' 'Active(anon): 9970992 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662568 kB' 'Mapped: 217592 kB' 'Shmem: 9322916 kB' 'KReclaimable: 267684 kB' 'Slab: 845012 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577328 kB' 'KernelStack: 22032 kB' 'PageTables: 8824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 11297828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215956 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
00:04:07.304 06:46:36 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] ... (compare-and-continue repeated for every field, MemFree through HardwareCorrupted)
00:04:07.305 06:46:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:07.305 06:46:36 -- setup/common.sh@33 -- # echo 0
00:04:07.305 06:46:36 -- setup/common.sh@33 -- # return 0
00:04:07.305 06:46:36 -- setup/hugepages.sh@97 -- # anon=0
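The anon=0 above comes from a two-step gate: verify_nr_hugepages only reads AnonHugePages when transparent hugepages are not fully disabled, which is what the traced [[ always [madvise] never != *[never]* ]] test decides (the bracketed word marks the active THP mode). A sketch of that gate, using the get_meminfo sketch above; the sysfs path is the standard THP location and is assumed here, not shown in the trace:

    anon=0
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
    # "always [madvise] never" on this box, so the string does not contain
    # the literal "[never]" and the meminfo read happens; it returns 0 kB
    # here anyway.
    if [[ $thp != *'[never]'* ]]; then
        anon=$(get_meminfo AnonHugePages)
    fi
    echo "anon=$anon"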
00:04:07.305 06:46:36 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:07.305 06:46:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.305 06:46:36 -- setup/common.sh@18 -- # local node=
00:04:07.305 06:46:36 -- setup/common.sh@19 -- # local var val
00:04:07.305 06:46:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.305 06:46:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.305 06:46:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.305 06:46:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.305 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.305 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.305 06:46:36 -- setup/common.sh@31 -- # IFS=': '
00:04:07.305 06:46:36 -- setup/common.sh@31 -- # read -r var val _
00:04:07.305 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41733488 kB' 'MemAvailable: 43472340 kB' 'Buffers: 4476 kB' 'Cached: 11420376 kB' 'SwapCached: 8 kB' 'Active: 10498504 kB' 'Inactive: 1585260 kB' 'Active(anon): 9970808 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662392 kB' 'Mapped: 217592 kB' 'Shmem: 9322920 kB' 'KReclaimable: 267684 kB' 'Slab: 845004 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577320 kB' 'KernelStack: 22000 kB' 'PageTables: 8688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 11298040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215908 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
00:04:07.305 06:46:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] ... (compare-and-continue repeated for every field through HugePages_Rsvd)
00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.306 06:46:36 -- setup/common.sh@33 -- # echo 0
00:04:07.306 06:46:36 -- setup/common.sh@33 -- # return 0
00:04:07.306 06:46:36 -- setup/hugepages.sh@99 -- # surp=0
00:04:07.306 06:46:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:07.306 06:46:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:07.306 06:46:36 -- setup/common.sh@18 -- # local node=
00:04:07.306 06:46:36 -- setup/common.sh@19 -- # local var val
00:04:07.306 06:46:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.306 06:46:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.306 06:46:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.306 06:46:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.306 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.306 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': '
00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _
00:04:07.306 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41734616 kB' 'MemAvailable: 43473468 kB' 'Buffers: 4476 kB' 'Cached: 11420388 kB' 'SwapCached: 8 kB' 'Active: 10498512 kB' 'Inactive: 1585260 kB' 'Active(anon): 9970816 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662360 kB' 'Mapped: 217568 kB' 'Shmem: 9322932 kB' 'KReclaimable: 267684 kB' 'Slab: 844980 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577296 kB' 'KernelStack: 22000 kB' 'PageTables: 8696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 11298056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215908 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] ... (compare-and-continue repeated for each field shown, MemTotal through Bounce)
continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.306 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.306 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 
-- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # continue 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.307 06:46:36 -- setup/common.sh@33 -- # echo 0 00:04:07.307 06:46:36 -- setup/common.sh@33 -- # return 0 00:04:07.307 06:46:36 -- setup/hugepages.sh@100 -- # resv=0 00:04:07.307 06:46:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:07.307 nr_hugepages=1536 00:04:07.307 06:46:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:07.307 resv_hugepages=0 00:04:07.307 06:46:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:07.307 surplus_hugepages=0 00:04:07.307 06:46:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:07.307 anon_hugepages=0 00:04:07.307 06:46:36 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:07.307 06:46:36 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:07.307 06:46:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:07.307 06:46:36 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:07.307 06:46:36 -- setup/common.sh@18 -- # local node= 00:04:07.307 06:46:36 -- setup/common.sh@19 -- # local var val 00:04:07.307 06:46:36 -- setup/common.sh@20 -- # local mem_f mem 00:04:07.307 06:46:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.307 06:46:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.307 06:46:36 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.307 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.307 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.307 06:46:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.307 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41734716 kB' 'MemAvailable: 43473568 kB' 'Buffers: 4476 kB' 
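The get_meminfo calls traced above all follow one pattern: pick /proc/meminfo (or a node's /sys/devices/system/node/nodeN/meminfo when a node argument is given), snapshot it with mapfile, strip the "Node <N> " prefix that per-node files carry, then read key/value pairs with IFS=': ' until the requested key matches and echo its value. Below is a minimal standalone sketch of that pattern; the function name and argument handling are illustrative, not the exact code of setup/common.sh.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below

    # Sketch: look up one key in /proc/meminfo or a node's meminfo file.
    get_meminfo_sketch() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # Per-node stats live in sysfs and prefix every line with "Node <N> ".
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix, if any
        local var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Against the snapshot above, get_meminfo_sketch HugePages_Rsvd would print 0, and get_meminfo_sketch HugePages_Surp 0 would read node0's file instead of the system-wide one.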
00:04:07.307 06:46:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:07.307 06:46:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:07.307 06:46:36 -- setup/common.sh@18 -- # local node=
00:04:07.307 06:46:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.307 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.307 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.307 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41734716 kB' 'MemAvailable: 43473568 kB' 'Buffers: 4476 kB' 'Cached: 11420388 kB' 'SwapCached: 8 kB' 'Active: 10498512 kB' 'Inactive: 1585260 kB' 'Active(anon): 9970816 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 662360 kB' 'Mapped: 217568 kB' 'Shmem: 9322932 kB' 'KReclaimable: 267684 kB' 'Slab: 844980 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577296 kB' 'KernelStack: 22000 kB' 'PageTables: 8696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 11298068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215908 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
00:04:07.307 06:46:36 -- setup/common.sh@32 -- # [xtrace condensed: each snapshot key compared against HugePages_Total; non-matching keys skipped with continue until the key matched]
00:04:07.308 06:46:36 -- setup/common.sh@33 -- # echo 1536
00:04:07.308 06:46:36 -- setup/common.sh@33 -- # return 0
00:04:07.308 06:46:36 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:07.308 06:46:36 -- setup/hugepages.sh@112 -- # get_nodes
00:04:07.308 06:46:36 -- setup/hugepages.sh@27 -- # local node
00:04:07.308 06:46:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.308 06:46:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:07.308 06:46:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.308 06:46:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:07.308 06:46:36 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:07.308 06:46:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:07.308 06:46:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.308 06:46:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.308 06:46:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:07.308 06:46:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.308 06:46:36 -- setup/common.sh@18 -- # local node=0
00:04:07.308 06:46:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:07.308 06:46:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:07.308 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.308 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.308 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26339888 kB' 'MemUsed: 6252196 kB' 'SwapCached: 8 kB' 'Active: 3920504 kB' 'Inactive: 187304 kB' 'Active(anon): 3623336 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956296 kB' 'Mapped: 166424 kB' 'AnonPages: 154816 kB' 'Shmem: 3471824 kB' 'KernelStack: 11880 kB' 'PageTables: 4196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 390972 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 266492 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:07.308 06:46:36 -- setup/common.sh@32 -- # [xtrace condensed: each node0 key compared against HugePages_Surp; non-matching keys skipped with continue until the key matched]
00:04:07.308 06:46:36 -- setup/common.sh@33 -- # echo 0
00:04:07.308 06:46:36 -- setup/common.sh@33 -- # return 0
00:04:07.308 06:46:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:07.308 06:46:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.308 06:46:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.308 06:46:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:07.308 06:46:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.308 06:46:36 -- setup/common.sh@18 -- # local node=1
00:04:07.308 06:46:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:07.308 06:46:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:07.308 06:46:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.308 06:46:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.308 06:46:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 15395132 kB' 'MemUsed: 12307996 kB' 'SwapCached: 0 kB' 'Active: 6578068 kB' 'Inactive: 1397956 kB' 'Active(anon): 6347540 kB' 'Inactive(anon): 11016 kB' 'Active(file): 230528 kB' 'Inactive(file): 1386940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7468616 kB' 'Mapped: 51144 kB' 'AnonPages: 507540 kB' 'Shmem: 5851148 kB' 'KernelStack: 10120 kB' 'PageTables: 4500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 143204 kB' 'Slab: 454008 kB' 'SReclaimable: 143204 kB' 'SUnreclaim: 310804 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:07.309 06:46:36 -- setup/common.sh@32 -- # [xtrace condensed: each node1 key compared against HugePages_Surp; non-matching keys skipped with continue until the key matched]
00:04:07.309 06:46:36 -- setup/common.sh@33 -- # echo 0
00:04:07.309 06:46:36 -- setup/common.sh@33 -- # return 0
00:04:07.309 06:46:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:07.309 06:46:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:07.309 06:46:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:07.309 06:46:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:07.309 06:46:36 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:07.309 06:46:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:07.309 06:46:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:07.309 06:46:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:07.309 06:46:36 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:04:07.309 06:46:36 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:07.309
00:04:07.309 real	0m3.712s
00:04:07.309 user	0m1.430s
00:04:07.309 sys	0m2.340s
00:04:07.309 06:46:36 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:07.309 06:46:36 -- common/autotest_common.sh@10 -- # set +x
00:04:07.309 ************************************
00:04:07.309 END TEST custom_alloc
00:04:07.309 ************************************
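The custom_alloc pass that just finished boils down to two checks with this run's numbers: the global pool (HugePages_Total: 1536) must equal the requested nr_hugepages plus surplus and reserved pages (1536 + 0 + 0), and the per-node counts read from the node0/node1 meminfo files (512 and 1024) must sum to the same total. A small sketch of that arithmetic, with the values hard-coded from the trace above; the variable names are illustrative:

    #!/usr/bin/env bash
    # Values as reported in the trace above.
    nr_hugepages=1536 surp=0 resv=0
    declare -A node_pages=([0]=512 [1]=1024)

    # Global check: the kernel's total pool == requested + surplus + reserved.
    (( 1536 == nr_hugepages + surp + resv )) || echo 'global hugepage count mismatch'

    # Per-node check: the node totals must add up to the global pool.
    total=0
    for n in "${!node_pages[@]}"; do
        (( total += node_pages[n] ))
    done
    (( total == nr_hugepages )) && echo "node split ${node_pages[0]},${node_pages[1]}: OK"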
00:04:07.309 06:46:36 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:07.309 06:46:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:07.309 06:46:36 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:07.309 06:46:36 -- common/autotest_common.sh@10 -- # set +x
00:04:07.309 ************************************
00:04:07.309 START TEST no_shrink_alloc
00:04:07.309 ************************************
00:04:07.309 06:46:36 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:04:07.309 06:46:36 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:07.309 06:46:36 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:07.309 06:46:36 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:07.309 06:46:36 -- setup/hugepages.sh@51 -- # shift
00:04:07.309 06:46:36 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:07.309 06:46:36 -- setup/hugepages.sh@52 -- # local node_ids
00:04:07.309 06:46:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:07.309 06:46:36 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:07.309 06:46:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:07.309 06:46:36 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:07.309 06:46:36 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:07.309 06:46:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:07.309 06:46:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:07.309 06:46:36 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:07.309 06:46:36 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:07.309 06:46:36 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:07.309 06:46:36 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:07.309 06:46:36 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:07.309 06:46:36 -- setup/hugepages.sh@73 -- # return 0
00:04:07.309 06:46:36 -- setup/hugepages.sh@198 -- # setup output
00:04:07.309 06:46:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:07.309 06:46:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:10.602 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.602 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:10.602 06:46:40 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:10.602 06:46:40 -- setup/hugepages.sh@89 -- # local node
00:04:10.602 06:46:40 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:10.602 06:46:40 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:10.602 06:46:40 -- setup/hugepages.sh@92 -- # local surp
00:04:10.602 06:46:40 -- setup/hugepages.sh@93 -- # local resv
00:04:10.602 06:46:40 -- setup/hugepages.sh@94 -- # local anon
00:04:10.602 06:46:40 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
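The @96 test above gates the anonymous-hugepage sampling on transparent hugepages: /sys/kernel/mm/transparent_hugepage/enabled lists the available modes with the active one in brackets (here "always [madvise] never"), and AnonHugePages is only worth checking when the active mode is not [never]. A minimal sketch of that guard; the sysfs path is the standard kernel location, and the rest is illustrative:

    #!/usr/bin/env bash
    # Read the active THP mode; the kernel brackets the selected value.
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
    if [[ $thp != *"[never]"* ]]; then
        # THP can produce anonymous huge pages, so the counter is meaningful.
        grep AnonHugePages /proc/meminfo
    fi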
mem 00:04:10.603 06:46:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42813556 kB' 'MemAvailable: 44552408 kB' 'Buffers: 4476 kB' 'Cached: 11420504 kB' 'SwapCached: 8 kB' 'Active: 10502636 kB' 'Inactive: 1585260 kB' 'Active(anon): 9974940 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 666584 kB' 'Mapped: 217628 kB' 'Shmem: 9323048 kB' 'KReclaimable: 267684 kB' 'Slab: 844932 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577248 kB' 'KernelStack: 22176 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11303260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216116 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 
00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.603 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.603 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 
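
The long runs of "[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / "continue" entries around this point are bash xtrace of a single field-matching loop inside setup/common.sh's get_meminfo: the helper snapshots a meminfo file into an array, strips any per-node "Node N " prefix, then scans line by line until the requested key matches. A minimal sketch of that logic, reconstructed from the trace itself (a hypothetical reconstruction, not the verbatim SPDK source):

  #!/usr/bin/env bash
  # Hypothetical reconstruction of setup/common.sh's get_meminfo, pieced
  # together from the xtrace in this log -- details may differ from the
  # real script.
  shopt -s extglob                      # needed for the +([0-9]) pattern

  get_meminfo() {
      local get=$1 node=${2:-}          # key to look up, optional NUMA node
      local mem_f=/proc/meminfo mem var val _ line
      # With a node argument, read that node's own meminfo file instead
      # (matches the mem_f=/sys/devices/system/node/node0/meminfo entry
      # later in this trace).
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")  # drop "Node <n> " prefixes, if any
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue   # the repeated entries seen here
          echo "${val:-0}"                   # e.g. "echo 0" for AnonHugePages
          return 0
      done
      return 1
  }

Against the dump above, get_meminfo AnonHugePages would print 0 and get_meminfo HugePages_Total would print 1024, matching the "echo 0" / "echo 1024" and "return 0" entries that close each loop in this trace.
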
00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:10.604 06:46:40 -- setup/common.sh@33 -- # echo 0 00:04:10.604 06:46:40 -- setup/common.sh@33 -- # return 0 00:04:10.604 06:46:40 -- setup/hugepages.sh@97 -- # anon=0 00:04:10.604 06:46:40 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:10.604 06:46:40 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:10.604 06:46:40 -- setup/common.sh@18 -- # local node= 00:04:10.604 06:46:40 -- setup/common.sh@19 -- # local var val 00:04:10.604 06:46:40 -- setup/common.sh@20 -- # local mem_f mem 00:04:10.604 06:46:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:10.604 06:46:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:10.604 06:46:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:10.604 06:46:40 -- setup/common.sh@28 -- # mapfile -t mem 00:04:10.604 06:46:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42814536 kB' 'MemAvailable: 44553388 kB' 'Buffers: 4476 kB' 'Cached: 11420508 kB' 'SwapCached: 8 kB' 'Active: 10502940 kB' 'Inactive: 1585260 kB' 'Active(anon): 9975244 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 666940 kB' 'Mapped: 217628 kB' 'Shmem: 9323052 kB' 'KReclaimable: 267684 kB' 'Slab: 844928 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577244 kB' 'KernelStack: 22144 kB' 'PageTables: 8864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11303404 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216100 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
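
A side note on the backslash-heavy patterns such as \H\u\g\e\P\a\g\e\s\_\S\u\r\p: when [[ $var == "$get" ]] runs under set -x, bash escapes every character of the quoted right-hand side in the trace to show it is being matched as a literal string, not as a glob. A minimal interactive reproduction (hypothetical session):

  $ get=HugePages_Surp var=MemTotal
  $ set -x
  $ [[ $var == "$get" ]]
  + [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]

So the escaping is an artifact of how xtrace renders quoted patterns; the comparison itself is an exact string match.
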
00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.604 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.604 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 
06:46:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.605 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.605 06:46:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Unaccepted 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.606 06:46:40 -- setup/common.sh@33 -- # echo 0 00:04:10.606 06:46:40 -- setup/common.sh@33 -- # return 0 00:04:10.606 06:46:40 -- setup/hugepages.sh@99 -- # surp=0 00:04:10.606 06:46:40 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:10.606 06:46:40 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:10.606 06:46:40 -- setup/common.sh@18 -- # local node= 00:04:10.606 06:46:40 -- setup/common.sh@19 -- # local var val 00:04:10.606 06:46:40 -- setup/common.sh@20 -- # local mem_f mem 00:04:10.606 06:46:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:10.606 06:46:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:10.606 06:46:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:10.606 06:46:40 -- setup/common.sh@28 -- # mapfile -t mem 00:04:10.606 06:46:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42813980 kB' 'MemAvailable: 44552832 kB' 'Buffers: 4476 kB' 'Cached: 11420520 kB' 'SwapCached: 8 kB' 'Active: 10502352 kB' 'Inactive: 1585260 kB' 'Active(anon): 9974656 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 666860 kB' 'Mapped: 217704 kB' 'Shmem: 9323064 kB' 'KReclaimable: 267684 kB' 'Slab: 845048 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577364 kB' 'KernelStack: 22224 kB' 'PageTables: 9300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11307548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216084 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:04:10.606 06:46:40 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.606 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.606 06:46:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- 
setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 
06:46:40 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.607 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.607 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.608 06:46:40 -- setup/common.sh@33 -- # echo 0 00:04:10.608 
06:46:40 -- setup/common.sh@33 -- # return 0 00:04:10.608 06:46:40 -- setup/hugepages.sh@100 -- # resv=0 00:04:10.608 06:46:40 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:10.608 nr_hugepages=1024 00:04:10.608 06:46:40 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:10.608 resv_hugepages=0 00:04:10.608 06:46:40 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:10.608 surplus_hugepages=0 00:04:10.608 06:46:40 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:10.608 anon_hugepages=0 00:04:10.608 06:46:40 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:10.608 06:46:40 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:10.608 06:46:40 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:10.608 06:46:40 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:10.608 06:46:40 -- setup/common.sh@18 -- # local node= 00:04:10.608 06:46:40 -- setup/common.sh@19 -- # local var val 00:04:10.608 06:46:40 -- setup/common.sh@20 -- # local mem_f mem 00:04:10.608 06:46:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:10.608 06:46:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:10.608 06:46:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:10.608 06:46:40 -- setup/common.sh@28 -- # mapfile -t mem 00:04:10.608 06:46:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42819128 kB' 'MemAvailable: 44557980 kB' 'Buffers: 4476 kB' 'Cached: 11420532 kB' 'SwapCached: 8 kB' 'Active: 10502108 kB' 'Inactive: 1585260 kB' 'Active(anon): 9974412 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 666012 kB' 'Mapped: 217628 kB' 'Shmem: 9323076 kB' 'KReclaimable: 267684 kB' 'Slab: 845044 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 577360 kB' 'KernelStack: 22144 kB' 'PageTables: 9224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11303432 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216068 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
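
The nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0 echoes just above are the inputs to the actual verification step: the kernel's HugePages_Total must account for the requested pages plus any surplus and reserved ones. A sketch of that check, using the get_meminfo helper sketched earlier (names follow the trace; hypothetical reconstruction):

  # Verify the kernel allocated what hugepages.sh requested.
  nr_hugepages=1024
  anon=$(get_meminfo AnonHugePages)     # 0 kB in this run (no THP in use)
  surp=$(get_meminfo HugePages_Surp)    # 0
  resv=$(get_meminfo HugePages_Rsvd)    # 0
  total=$(get_meminfo HugePages_Total)  # 1024
  (( total == nr_hugepages + surp + resv )) || exit 1

Here 1024 == 1024 + 0 + 0 holds, so the trace proceeds to the per-node breakdown.
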
00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.608 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.608 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 
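
A little further below, the trace moves from system-wide totals to per-NUMA-node accounting: get_nodes globs /sys/devices/system/node/node+([0-9]) (two nodes on this box), and get_meminfo is then invoked with a node argument, swapping /proc/meminfo for that node's own meminfo file. A sketch of the enumeration (hypothetical; the real get_nodes may populate nodes_sys from sysfs hugepage counters directly):

  shopt -s extglob
  nodes_sys=()
  for node in /sys/devices/system/node/node+([0-9]); do
      # node0 carries all 1024 pages in this run, node1 none
      nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
  done
  no_nodes=${#nodes_sys[@]}             # 2, matching "no_nodes=2" in the trace
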
00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.609 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.609 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 
06:46:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # continue 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.610 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.610 06:46:40 -- setup/common.sh@33 -- # echo 1024 00:04:10.610 06:46:40 -- setup/common.sh@33 -- # return 0 00:04:10.610 06:46:40 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:10.610 06:46:40 -- setup/hugepages.sh@112 -- # get_nodes 00:04:10.610 06:46:40 -- setup/hugepages.sh@27 -- # local node 00:04:10.610 06:46:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:10.610 06:46:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:10.610 06:46:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:10.610 06:46:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:10.610 06:46:40 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:10.610 06:46:40 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:10.610 06:46:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:10.610 06:46:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:10.610 06:46:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:10.610 06:46:40 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:10.610 06:46:40 
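The get_nodes trace above enumerates NUMA nodes with an extglob pattern and keys an array by node index. As a reading aid, here is a minimal standalone sketch of that enumeration; the per-node counts are read here from the 2048 kB sysfs files (an assumption matching the Hugepagesize the dumps below report), whereas the real helper keeps its own bookkeeping.

#!/usr/bin/env bash
# Sketch of the node enumeration traced at setup/hugepages.sh@27-@33.
shopt -s extglob nullglob   # +([0-9]) needs extglob; nullglob skips the loop cleanly if no node dirs exist
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    # ${node##*node} strips everything through the last "node", leaving the index
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}
if (( no_nodes > 0 )); then
    for idx in "${!nodes_sys[@]}"; do
        echo "node$idx: ${nodes_sys[$idx]} hugepages"   # this box: node0=1024, node1=0
    done
fi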
00:04:10.610 06:46:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:10.610 06:46:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:10.610 06:46:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:10.610 06:46:40 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.610 06:46:40 -- setup/common.sh@18 -- # local node=0
00:04:10.610 06:46:40 -- setup/common.sh@19 -- # local var val
00:04:10.610 06:46:40 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.610 06:46:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.610 06:46:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:10.610 06:46:40 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:10.610 06:46:40 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.610 06:46:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.610 06:46:40 -- setup/common.sh@31 -- # IFS=': '
00:04:10.610 06:46:40 -- setup/common.sh@31 -- # read -r var val _
00:04:10.610 06:46:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25308680 kB' 'MemUsed: 7283404 kB' 'SwapCached: 8 kB' 'Active: 3922576 kB' 'Inactive: 187304 kB' 'Active(anon): 3625408 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956396 kB' 'Mapped: 166428 kB' 'AnonPages: 156964 kB' 'Shmem: 3471924 kB' 'KernelStack: 11944 kB' 'PageTables: 4448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 390984 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 266504 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
(setup/common.sh@31-@32 repeat the same read / compare / continue trace for every node0 meminfo field until HugePages_Surp matches)
00:04:10.611 06:46:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.611 06:46:40 -- setup/common.sh@33 -- # echo 0
00:04:10.611 06:46:40 -- setup/common.sh@33 -- # return 0
00:04:10.611 06:46:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:10.611 06:46:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:10.611 06:46:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:10.611 06:46:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:10.611 06:46:40 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:10.611 node0=1024 expecting 1024
00:04:10.611 06:46:40 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:10.611 06:46:40 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:10.611 06:46:40 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:10.611 06:46:40 -- setup/hugepages.sh@202 -- # setup output
00:04:10.611 06:46:40 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:10.611 06:46:40 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:13.911 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:13.911 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:13.911 INFO: Requested 512 hugepages but 1024 already allocated on node0
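Every get_meminfo call traced in this log follows the same shape: pick /proc/meminfo or the per-node sysfs file, strip the "Node N " prefix that per-node lines carry, then split each "Field: value" line and print the first match. A sketch of that helper as reconstructed from the setup/common.sh@17-@33 trace above, offered as an illustration rather than the SPDK script itself:

#!/usr/bin/env bash
# Sketch of get_meminfo as it reads from the trace above.
shopt -s extglob
get_meminfo() {
    local get=$1 node=${2:-}
    local var val _ mem mem_f=/proc/meminfo
    # With a node argument, read the per-node file instead (@23-@24 above).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # per-node lines start with "Node N " (@29)
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }   # @32-@33
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}
get_meminfo HugePages_Total     # whole machine: 1024 in this run
get_meminfo HugePages_Surp 0    # node 0 only, as traced above

The backslash-escaped patterns in the trace, e.g. [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]], are just bash forcing a literal string comparison; quoting "$get" in the sketch has the same effect.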
06:46:43 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:13.911 06:46:43 -- setup/hugepages.sh@89 -- # local node
00:04:13.911 06:46:43 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:13.911 06:46:43 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:13.911 06:46:43 -- setup/hugepages.sh@92 -- # local surp
00:04:13.911 06:46:43 -- setup/hugepages.sh@93 -- # local resv
00:04:13.911 06:46:43 -- setup/hugepages.sh@94 -- # local anon
00:04:13.911 06:46:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:13.911 06:46:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:13.911 06:46:43 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:13.911 06:46:43 -- setup/common.sh@18 -- # local node=
00:04:13.911 06:46:43 -- setup/common.sh@19 -- # local var val
00:04:13.911 06:46:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.911 06:46:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.911 06:46:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.911 06:46:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.911 06:46:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.911 06:46:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.911 06:46:43 -- setup/common.sh@31 -- # IFS=': '
00:04:13.911 06:46:43 -- setup/common.sh@31 -- # read -r var val _
00:04:13.911 06:46:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42850052 kB' 'MemAvailable: 44588904 kB' 'Buffers: 4476 kB' 'Cached: 11420616 kB' 'SwapCached: 8 kB' 'Active: 10503048 kB' 'Inactive: 1585260 kB' 'Active(anon): 9975352 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 666588 kB' 'Mapped: 217596 kB' 'Shmem: 9323160 kB' 'KReclaimable: 267684 kB' 'Slab: 844564 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576880 kB' 'KernelStack: 22592 kB' 'PageTables: 10144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11306576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216276 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
(setup/common.sh@31-@32 repeat the same read / compare / continue trace for every meminfo field until AnonHugePages matches)
00:04:13.912 06:46:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:13.912 06:46:43 -- setup/common.sh@33 -- # echo 0
00:04:13.912 06:46:43 -- setup/common.sh@33 -- # return 0
00:04:13.912 06:46:43 -- setup/hugepages.sh@97 -- # anon=0
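The hugepages.sh@96 test above pattern-matches the kernel's transparent-hugepage mode line, in which the active mode is bracketed (here "always [madvise] never"), and only then queries AnonHugePages. A hedged sketch of that probe; the awk getter is a stand-in of mine, not the script's own helper:

#!/usr/bin/env bash
# Sketch of the THP probe traced at setup/hugepages.sh@96-@97.
thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
anon=0
if [[ $thp != *"[never]"* ]]; then
    # AnonHugePages is reported in kB; this run read 0
    anon=$(awk -F': +' '$1 == "AnonHugePages" { print $2 + 0 }' /proc/meminfo)
fi
echo "THP mode line: $thp; AnonHugePages: ${anon} kB"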
00:04:13.912 06:46:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:13.912 06:46:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.912 06:46:43 -- setup/common.sh@18 -- # local node=
00:04:13.912 06:46:43 -- setup/common.sh@19 -- # local var val
00:04:13.912 06:46:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.912 06:46:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.912 06:46:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.912 06:46:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.912 06:46:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.912 06:46:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.912 06:46:43 -- setup/common.sh@31 -- # IFS=': '
00:04:13.912 06:46:43 -- setup/common.sh@31 -- # read -r var val _
00:04:13.912 06:46:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42857136 kB' 'MemAvailable: 44595988 kB' 'Buffers: 4476 kB' 'Cached: 11420620 kB' 'SwapCached: 8 kB' 'Active: 10502236 kB' 'Inactive: 1585260 kB' 'Active(anon): 9974540 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 665820 kB' 'Mapped: 217584 kB' 'Shmem: 9323164 kB' 'KReclaimable: 267684 kB' 'Slab: 844460 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576776 kB' 'KernelStack: 22640 kB' 'PageTables: 10532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11302900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216148 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
(setup/common.sh@31-@32 repeat the same read / compare / continue trace for every meminfo field until HugePages_Surp matches)
00:04:13.913 06:46:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:13.913 06:46:43 -- setup/common.sh@33 -- # echo 0
00:04:13.913 06:46:43 -- setup/common.sh@33 -- # return 0
00:04:13.913 06:46:43 -- setup/hugepages.sh@99 -- # surp=0
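The surplus value just computed feeds the accounting identity checked at hugepages.sh@107 and @110 further down: HugePages_Total must equal the requested nr_hugepages plus surplus plus reserved pages. A self-contained sketch of that check, with awk standing in for the script's own getter and nr_hugepages fixed to the value this run expects:

#!/usr/bin/env bash
# Sketch of the identity traced at setup/hugepages.sh@107/@110:
# HugePages_Total == nr_hugepages + surplus + reserved.
meminfo() { awk -v k="$1" -F': +' '$1 == k { print $2 + 0 }' /proc/meminfo; }
nr_hugepages=1024                     # echoed below as nr_hugepages=1024
total=$(meminfo HugePages_Total)
surp=$(meminfo HugePages_Surp)
resv=$(meminfo HugePages_Rsvd)
if (( total == nr_hugepages + surp + resv )); then
    echo "consistent: HugePages_Total=$total"
else
    echo "mismatch: HugePages_Total=$total, expected $((nr_hugepages + surp + resv))" >&2
fi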
00:04:13.913 06:46:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:13.913 06:46:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:13.913 06:46:43 -- setup/common.sh@18 -- # local node=
00:04:13.913 06:46:43 -- setup/common.sh@19 -- # local var val
00:04:13.913 06:46:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.913 06:46:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.913 06:46:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.913 06:46:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.913 06:46:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.913 06:46:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.914 06:46:43 -- setup/common.sh@31 -- # IFS=': '
00:04:13.914 06:46:43 -- setup/common.sh@31 -- # read -r var val _
00:04:13.914 06:46:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42857096 kB' 'MemAvailable: 44595948 kB' 'Buffers: 4476 kB' 'Cached: 11420628 kB' 'SwapCached: 8 kB' 'Active: 10502728 kB' 'Inactive: 1585260 kB' 'Active(anon): 9975032 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 666192 kB' 'Mapped: 217584 kB' 'Shmem: 9323172 kB' 'KReclaimable: 267684 kB' 'Slab: 844376 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576692 kB' 'KernelStack: 22624 kB' 'PageTables: 10288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11304060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216212 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB'
(setup/common.sh@31-@32 repeat the same read / compare / continue trace for every meminfo field until HugePages_Rsvd matches)
00:04:13.915 06:46:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:13.915 06:46:43 -- setup/common.sh@33 -- # echo 0
00:04:13.915 06:46:43 -- setup/common.sh@33 -- # return 0
00:04:13.915 06:46:43 -- setup/hugepages.sh@100 -- # resv=0
00:04:13.915 06:46:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:13.915 nr_hugepages=1024
00:04:13.915 06:46:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:13.915 resv_hugepages=0
00:04:13.915 06:46:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:13.915 surplus_hugepages=0
00:04:13.915 06:46:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:13.915 anon_hugepages=0
00:04:13.915 06:46:43 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:13.915 06:46:43 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:13.915 06:46:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:13.915 06:46:43 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:13.915 06:46:43 -- setup/common.sh@18 -- # local node=
00:04:13.915 06:46:43 -- setup/common.sh@19 -- # local var val
00:04:13.915 06:46:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.915 06:46:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.915 06:46:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.915 06:46:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.915 06:46:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.915 06:46:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.915 06:46:43 -- setup/common.sh@31 -- # IFS=': '
00:04:13.915 06:46:43 -- setup/common.sh@31 -- # read -r var val _
00:04:13.915 06:46:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 42858040 kB' 'MemAvailable: 44596892 kB' 'Buffers: 4476 kB' 'Cached: 11420644 kB' 'SwapCached: 8 kB' 'Active: 10501768 kB' 'Inactive: 1585260 kB' 'Active(anon): 9974072 kB' 'Inactive(anon): 11024 kB' 'Active(file): 527696 kB' 'Inactive(file): 1574236 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8384764 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 665212 kB' 'Mapped: 217584 kB' 'Shmem: 9323188 kB' 'KReclaimable: 267684 kB' 'Slab: 844376 kB' 'SReclaimable: 267684 kB' 'SUnreclaim: 576692 kB' 'KernelStack: 22480 kB' 'PageTables: 9332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 11304072 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 216180 kB' 'VmallocChunk: 0 kB' 'Percpu: 79296 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2037108 kB' 'DirectMap2M: 16523264 kB' 'DirectMap1G: 50331648 kB' 00:04:13.915 06:46:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.915 06:46:43 -- setup/common.sh@32 -- # continue [the same check-and-continue pair repeats for each /proc/meminfo field, MemFree through HugePages_Free, none matching HugePages_Total] 00:04:13.916 06:46:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.916 06:46:43 --
setup/common.sh@33 -- # echo 1024 00:04:13.917 06:46:43 -- setup/common.sh@33 -- # return 0 00:04:13.917 06:46:43 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:13.917 06:46:43 -- setup/hugepages.sh@112 -- # get_nodes 00:04:13.917 06:46:43 -- setup/hugepages.sh@27 -- # local node 00:04:13.917 06:46:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.917 06:46:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:13.917 06:46:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.917 06:46:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:13.917 06:46:43 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:13.917 06:46:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:13.917 06:46:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:13.917 06:46:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:13.917 06:46:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:13.917 06:46:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.917 06:46:43 -- setup/common.sh@18 -- # local node=0 00:04:13.917 06:46:43 -- setup/common.sh@19 -- # local var val 00:04:13.917 06:46:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.917 06:46:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.917 06:46:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:13.917 06:46:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:13.917 06:46:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.917 06:46:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.917 06:46:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25326056 kB' 'MemUsed: 7266028 kB' 'SwapCached: 8 kB' 'Active: 3922568 kB' 'Inactive: 187304 kB' 'Active(anon): 3625400 kB' 'Inactive(anon): 8 kB' 'Active(file): 297168 kB' 'Inactive(file): 187296 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3956504 kB' 'Mapped: 166436 kB' 'AnonPages: 156640 kB' 'Shmem: 3472032 kB' 'KernelStack: 11864 kB' 'PageTables: 4188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124480 kB' 'Slab: 390468 kB' 'SReclaimable: 124480 kB' 'SUnreclaim: 265988 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # continue 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # continue 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # continue 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.917 06:46:43 -- setup/common.sh@31 -- # read 
-r var val _ 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.917 06:46:43 -- setup/common.sh@32 -- # continue [the same check-and-continue pair repeats for each node0 meminfo field, Active through HugePages_Free, none matching HugePages_Surp] 00:04:13.918 06:46:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.918 06:46:43 -- setup/common.sh@33 -- # echo 0 00:04:13.918 06:46:43 -- setup/common.sh@33 -- # return 0 00:04:13.918 06:46:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.918 06:46:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.918 06:46:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.918 06:46:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.918 06:46:43 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:13.918 node0=1024 expecting 1024 00:04:13.918 06:46:43 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:13.918 00:04:13.918 real 0m6.800s 00:04:13.918 user 0m2.450s 00:04:13.918 sys 0m4.310s 06:46:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.918 06:46:43 -- common/autotest_common.sh@10 -- # set +x 00:04:13.918 ************************************ 00:04:13.918 END TEST no_shrink_alloc 00:04:13.918 ************************************
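The whole hugepage audit above is driven by a single helper, get_meminfo in setup/common.sh, whose xtrace dominates this part of the log: it walks /proc/meminfo (or a node's meminfo file) line by line with IFS=': ', compares each field name against the requested key, and echoes the value on a match. A minimal standalone sketch, reconstructed from the trace (the real helper slurps the file with mapfile and strips the "Node <n>" prefix with an extglob pattern, which this condenses into a read loop):

get_meminfo() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
    # A node argument switches the source to that node's meminfo file.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS= read -r line; do
        line=${line#Node $node }          # per-node lines carry a "Node <n> " prefix
        IFS=': ' read -r var val _ <<<"$line"
        if [[ $var == "$get" ]]; then     # e.g. HugePages_Rsvd, HugePages_Total
            echo "$val"
            return 0
        fi
    done <"$mem_f"
    return 1
}

get_meminfo HugePages_Total      # prints 1024 on this box
get_meminfo HugePages_Surp 0     # node-0 lookup; prints 0 in the run above

This is why each lookup produces one check-and-continue pair per meminfo field: the scan is strictly linear and stops only at the requested key.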
00:04:13.918 06:46:43 -- setup/hugepages.sh@217 -- # clear_hp 00:04:13.918 06:46:43 -- setup/hugepages.sh@37 -- # local node hp 00:04:13.918 06:46:43 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:13.918 06:46:43 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:13.918 06:46:43 -- setup/hugepages.sh@41 -- # echo 0 00:04:13.918 06:46:43 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:13.918 06:46:43 -- setup/hugepages.sh@41 -- # echo 0 00:04:13.918 06:46:43 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:13.918 06:46:43 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:13.918 06:46:43 -- setup/hugepages.sh@41 -- # echo 0 00:04:13.918 06:46:43 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:13.918 06:46:43 -- setup/hugepages.sh@41 -- # echo 0 00:04:13.918 06:46:43 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:13.918 06:46:43 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:13.918 00:04:13.918 real 0m27.207s 00:04:13.918 user 0m9.606s 00:04:13.918 sys 0m16.451s 06:46:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.918 06:46:43 -- common/autotest_common.sh@10 -- # set +x 00:04:13.918 ************************************ 00:04:13.918 END TEST hugepages 00:04:13.918 ************************************ 00:04:13.918 06:46:43 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:13.918 06:46:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:13.918 06:46:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:13.918 06:46:43 -- common/autotest_common.sh@10 -- # set +x 00:04:13.918 ************************************ 00:04:13.918 START TEST driver 00:04:13.918 ************************************ 00:04:13.918 06:46:43 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:13.918 * Looking for test storage...
00:04:13.918 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:13.918 06:46:43 -- setup/driver.sh@68 -- # setup reset 00:04:13.918 06:46:43 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:13.918 06:46:43 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:19.197 06:46:48 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:19.197 06:46:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:19.197 06:46:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:19.197 06:46:48 -- common/autotest_common.sh@10 -- # set +x 00:04:19.197 ************************************ 00:04:19.197 START TEST guess_driver 00:04:19.197 ************************************ 00:04:19.197 06:46:48 -- common/autotest_common.sh@1104 -- # guess_driver 00:04:19.197 06:46:48 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:19.197 06:46:48 -- setup/driver.sh@47 -- # local fail=0 00:04:19.197 06:46:48 -- setup/driver.sh@49 -- # pick_driver 00:04:19.197 06:46:48 -- setup/driver.sh@36 -- # vfio 00:04:19.197 06:46:48 -- setup/driver.sh@21 -- # local iommu_grups 00:04:19.197 06:46:48 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:19.197 06:46:48 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:19.197 06:46:48 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:19.197 06:46:48 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:19.197 06:46:48 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:19.197 06:46:48 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:19.197 06:46:48 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:19.197 06:46:48 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:19.197 06:46:48 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:19.197 06:46:48 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:19.197 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:19.197 06:46:48 -- setup/driver.sh@30 -- # return 0 00:04:19.197 06:46:48 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:19.197 06:46:48 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:19.197 06:46:48 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:19.197 06:46:48 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:19.197 Looking for driver=vfio-pci 00:04:19.197 06:46:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:19.197 06:46:48 -- setup/driver.sh@45 -- # setup output config 00:04:19.197 06:46:48 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:19.197 06:46:48 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:22.489 06:46:51 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:22.489 06:46:51 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:04:22.489 06:46:51 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver [the same marker read and vfio-pci match repeats for every remaining device line of the setup.sh config output] 00:04:23.923 06:46:53 --
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:23.923 06:46:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:23.923 06:46:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:23.923 06:46:53 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:23.923 06:46:53 -- setup/driver.sh@65 -- # setup reset 00:04:23.923 06:46:53 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:23.923 06:46:53 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:29.198 00:04:29.198 real 0m9.906s 00:04:29.198 user 0m2.683s 00:04:29.198 sys 0m4.948s 06:46:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:29.198 06:46:58 -- common/autotest_common.sh@10 -- # set +x 00:04:29.198 ************************************ 00:04:29.198 END TEST guess_driver 00:04:29.198 ************************************ 00:04:29.198 00:04:29.198 real 0m14.854s 00:04:29.198 user 0m4.065s 00:04:29.198 sys 0m7.738s 06:46:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:29.198 06:46:58 -- common/autotest_common.sh@10 -- # set +x 00:04:29.198 ************************************ 00:04:29.198 END TEST driver 00:04:29.198 ************************************
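Before the devices suite starts, it is worth pinning down what guess_driver actually did above: pick_driver chose vfio-pci because the host exposes IOMMU groups and modprobe can resolve the vfio_pci module chain. A condensed sketch of that decision, with the failure branch hedged (this run only exercises the vfio path; "No valid driver found" is the sentinel string checked at driver.sh@51):

shopt -s nullglob                      # an empty glob must count as zero groups
pick_driver() {
    local groups=(/sys/kernel/iommu_groups/*)
    # 176 groups were present on this node, so the vfio path was taken.
    if ((${#groups[@]} > 0)) && modprobe --show-depends vfio_pci | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}
driver=$(pick_driver)
echo "Looking for driver=$driver"

The unsafe_vfio check in the trace (enable_unsafe_noiommu_mode, which stays N here) is an extra escape hatch for hosts without a usable IOMMU.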
00:04:29.198 06:46:58 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:29.198 06:46:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:29.198 06:46:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:29.198 06:46:58 -- common/autotest_common.sh@10 -- # set +x 00:04:29.198 ************************************ 00:04:29.198 START TEST devices 00:04:29.198 ************************************ 00:04:29.198 06:46:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:29.198 * Looking for test storage... 00:04:29.198 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:29.198 06:46:58 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:29.198 06:46:58 -- setup/devices.sh@192 -- # setup reset 00:04:29.198 06:46:58 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:29.198 06:46:58 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:32.493 06:47:02 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:32.493 06:47:02 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:32.493 06:47:02 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:32.493 06:47:02 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:32.493 06:47:02 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:32.493 06:47:02 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:32.493 06:47:02 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:32.493 06:47:02 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:32.493 06:47:02 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:32.493 06:47:02 -- setup/devices.sh@196 -- # blocks=() 00:04:32.493 06:47:02 -- setup/devices.sh@196 -- # declare -a blocks 00:04:32.493 06:47:02 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:32.493 06:47:02 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:32.493 06:47:02 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:32.493 06:47:02 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:32.493 06:47:02 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:32.493 06:47:02 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:32.493 06:47:02 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:32.493 06:47:02 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:32.493 06:47:02 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:32.493 06:47:02 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:32.493 06:47:02 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:32.493 No valid GPT data, bailing 00:04:32.493 06:47:02 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:32.493 06:47:02 -- scripts/common.sh@393 -- # pt= 00:04:32.493 06:47:02 -- scripts/common.sh@394 -- # return 1 00:04:32.493 06:47:02 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:32.493 06:47:02 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:32.493 06:47:02 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:32.493 06:47:02 -- setup/common.sh@80 -- # echo 1600321314816 00:04:32.493 06:47:02 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:32.493 06:47:02 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:32.493 06:47:02 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:32.493 06:47:02 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:32.493 06:47:02 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:32.493 06:47:02 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:32.493 06:47:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:32.493 06:47:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:32.493 06:47:02 -- common/autotest_common.sh@10 -- # set +x 00:04:32.493 ************************************ 00:04:32.493 START TEST nvme_mount 00:04:32.493 ************************************ 00:04:32.493
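The device scan that just ran gates each NVMe namespace on two conditions before admitting it as the test disk: it must carry no recognizable partition table (spdk-gpt.py bails, blkid reports an empty PTTYPE) and it must be at least min_disk_size bytes. A simplified sketch of that gate, with the spdk-gpt.py handshake omitted (blkid and the 3 GiB floor come straight from the trace; the size arithmetic uses the standard 512-byte-sector convention of /sys/block):

min_disk_size=3221225472               # 3 GiB, as set at devices.sh@198
dev=nvme0n1
pt=$(blkid -s PTTYPE -o value "/dev/$dev")
size=$(( $(cat "/sys/block/$dev/size") * 512 ))   # /sys reports 512 B sectors
if [[ -z $pt ]] && (( size >= min_disk_size )); then
    echo "test_disk=$dev ($size bytes)"           # 1600321314816 bytes here
fi

Only one namespace passes on this machine, so blocks ends up with a single entry mapped to PCI address 0000:d8:00.0, and the mount tests below run against it.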
06:47:02 -- common/autotest_common.sh@1104 -- # nvme_mount 00:04:32.493 06:47:02 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:32.493 06:47:02 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:32.493 06:47:02 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:32.493 06:47:02 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:32.493 06:47:02 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:32.493 06:47:02 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:32.493 06:47:02 -- setup/common.sh@40 -- # local part_no=1 00:04:32.493 06:47:02 -- setup/common.sh@41 -- # local size=1073741824 00:04:32.753 06:47:02 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:32.753 06:47:02 -- setup/common.sh@44 -- # parts=() 00:04:32.753 06:47:02 -- setup/common.sh@44 -- # local parts 00:04:32.753 06:47:02 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:32.753 06:47:02 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:32.753 06:47:02 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:32.753 06:47:02 -- setup/common.sh@46 -- # (( part++ )) 00:04:32.753 06:47:02 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:32.753 06:47:02 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:32.753 06:47:02 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:32.753 06:47:02 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:33.692 Creating new GPT entries in memory. 00:04:33.692 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:33.692 other utilities. 00:04:33.692 06:47:03 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:33.693 06:47:03 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:33.693 06:47:03 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:33.693 06:47:03 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:33.693 06:47:03 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:34.633 Creating new GPT entries in memory. 00:04:34.633 The operation has completed successfully. 
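The partitioning step above is easy to replay standalone: zap any existing labels, then create one partition spanning sectors 2048 through 2099199, i.e. 2097152 sectors x 512 B = the 1 GiB requested by partition_drive. A sketch of the same sequence (flock and the sector maths come from the trace; partprobe/udevadm are assumed stand-ins for the sync_dev_uevents.sh helper, which waits for the matching partition uevent):

disk=/dev/nvme0n1
sgdisk "$disk" --zap-all                           # destroy GPT and MBR structures
flock "$disk" sgdisk "$disk" --new=1:2048:2099199  # one 1 GiB partition, serialized
partprobe "$disk"                                  # assumed stand-in: re-read the table
udevadm settle                                     # wait for /dev/nvme0n1p1 to appear

The flock mirrors setup/common.sh@60 and keeps concurrent partition edits from interleaving; waiting for the uevent matters because mkfs runs against /dev/nvme0n1p1 immediately afterwards.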
00:04:34.633 06:47:04 -- setup/common.sh@57 -- # (( part++ )) 00:04:34.633 06:47:04 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:34.633 06:47:04 -- setup/common.sh@62 -- # wait 2579282 00:04:34.633 06:47:04 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:34.633 06:47:04 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:34.633 06:47:04 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:34.633 06:47:04 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:34.633 06:47:04 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:34.633 06:47:04 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:34.633 06:47:04 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:34.633 06:47:04 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:34.633 06:47:04 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:34.633 06:47:04 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:34.633 06:47:04 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:34.633 06:47:04 -- setup/devices.sh@53 -- # local found=0 00:04:34.633 06:47:04 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:34.633 06:47:04 -- setup/devices.sh@56 -- # : 00:04:34.633 06:47:04 -- setup/devices.sh@59 -- # local pci status 00:04:34.633 06:47:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.633 06:47:04 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:34.633 06:47:04 -- setup/devices.sh@47 -- # setup output config 00:04:34.633 06:47:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:34.633 06:47:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:37.928 06:47:07 -- setup/devices.sh@63 -- # found=1 00:04:37.928 06:47:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.928 06:47:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 06:47:07 -- setup/devices.sh@60 -- # read -r pci _ _ status [the same non-matching PCI_ALLOWED check repeats for 0000:00:04.5 through 0000:80:04.7] 00:04:37.928 06:47:07 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:37.928 06:47:07 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:37.928 06:47:07 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:37.928 06:47:07 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:37.928 06:47:07 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:37.928 06:47:07 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:37.928 06:47:07 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:37.928 06:47:07 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:37.928 06:47:07 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:37.928 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:37.928 06:47:07 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:37.928 06:47:07 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:38.188 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:38.188 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:38.188 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe
(PMBR): 55 aa 00:04:38.188 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:38.188 06:47:08 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:38.188 06:47:08 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:38.188 06:47:08 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:38.188 06:47:08 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:38.188 06:47:08 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:38.447 06:47:08 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:38.447 06:47:08 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:38.447 06:47:08 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:38.447 06:47:08 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:38.447 06:47:08 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:38.447 06:47:08 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:38.447 06:47:08 -- setup/devices.sh@53 -- # local found=0 00:04:38.447 06:47:08 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:38.447 06:47:08 -- setup/devices.sh@56 -- # : 00:04:38.447 06:47:08 -- setup/devices.sh@59 -- # local pci status 00:04:38.447 06:47:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.447 06:47:08 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:38.447 06:47:08 -- setup/devices.sh@47 -- # setup output config 00:04:38.447 06:47:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:38.447 06:47:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:41.743 06:47:11 -- setup/devices.sh@63 -- # found=1 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:41.743 06:47:11 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:41.743 06:47:11 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:41.743 06:47:11 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:41.743 06:47:11 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:41.743 06:47:11 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:41.743 06:47:11 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:41.743 06:47:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:41.743 06:47:11 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:41.743 06:47:11 -- setup/devices.sh@50 -- # local mount_point= 00:04:41.743 06:47:11 -- setup/devices.sh@51 -- # local test_file= 00:04:41.743 06:47:11 -- setup/devices.sh@53 -- # local found=0 00:04:41.743 06:47:11 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:41.743 06:47:11 -- setup/devices.sh@59 -- # local pci status 00:04:41.743 06:47:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.743 06:47:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:41.743 06:47:11 -- setup/devices.sh@47 -- # setup output config 00:04:41.743 06:47:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:41.743 06:47:11 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:45.036 06:47:14 -- setup/devices.sh@63 -- # found=1 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.036 06:47:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:45.036 06:47:14 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:45.036 06:47:14 -- setup/devices.sh@68 -- # return 0 00:04:45.036 06:47:14 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:45.036 06:47:14 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.036 06:47:14 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:45.036 06:47:14 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:45.036 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:45.036 00:04:45.037 real 0m12.486s 00:04:45.037 user 0m3.563s 00:04:45.037 sys 0m6.851s 00:04:45.037 06:47:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:45.037 06:47:14 -- common/autotest_common.sh@10 -- # set +x 00:04:45.037 ************************************ 00:04:45.037 END TEST nvme_mount 00:04:45.037 ************************************ 00:04:45.037 06:47:14 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:45.037 06:47:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:45.037 06:47:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:45.037 06:47:14 -- common/autotest_common.sh@10 -- # set +x 00:04:45.037 ************************************ 00:04:45.037 START TEST dm_mount 00:04:45.037 ************************************ 00:04:45.037 06:47:14 -- common/autotest_common.sh@1104 -- # dm_mount 00:04:45.037 06:47:14 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:45.037 06:47:14 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:45.037 06:47:14 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:45.037 06:47:14 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:45.037 06:47:14 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:45.037 06:47:14 -- setup/common.sh@40 -- # local part_no=2 00:04:45.037 06:47:14 -- setup/common.sh@41 -- # local size=1073741824 00:04:45.037 06:47:14 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:45.037 06:47:14 -- setup/common.sh@44 -- # parts=() 00:04:45.037 06:47:14 -- setup/common.sh@44 -- # local parts 00:04:45.037 06:47:14 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:45.037 06:47:14 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:45.037 06:47:14 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:45.037 06:47:14 -- setup/common.sh@46 -- # (( part++ )) 00:04:45.037 06:47:14 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:45.037 06:47:14 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:45.037 06:47:14 -- setup/common.sh@46 -- # (( part++ )) 00:04:45.037 06:47:14 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:45.037 06:47:14 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:45.037 06:47:14 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:45.037 06:47:14 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:46.417 Creating new GPT entries in memory. 00:04:46.417 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:46.417 other utilities. 00:04:46.417 06:47:15 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:46.417 06:47:15 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:46.417 06:47:15 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:46.417 06:47:15 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:46.417 06:47:15 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:47.414 Creating new GPT entries in memory. 00:04:47.414 The operation has completed successfully. 
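A minimal sketch of the partition loop traced above (setup/common.sh): part_start/part_end are derived from the previous partition and the sector count, and each sgdisk call is serialized with flock. Variable names and the disk/size values mirror the trace; everything else is illustrative.

disk=nvme0n1
part_no=2
size=$(( 1073741824 / 512 ))   # bytes -> 512-byte sectors (2097152)
part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
  (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
  (( part_end = part_start + size - 1 ))
  # flock keeps concurrent sgdisk runs from racing on the same disk
  flock "/dev/$disk" sgdisk "/dev/$disk" --new="$part:$part_start:$part_end"
done

With these inputs the loop reproduces the two calls seen in the trace: --new=1:2048:2099199 and --new=2:2099200:4196351.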
00:04:47.414 06:47:16 -- setup/common.sh@57 -- # (( part++ )) 00:04:47.414 06:47:16 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:47.414 06:47:16 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:47.414 06:47:16 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:47.414 06:47:16 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:48.351 The operation has completed successfully. 00:04:48.351 06:47:17 -- setup/common.sh@57 -- # (( part++ )) 00:04:48.351 06:47:17 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:48.351 06:47:17 -- setup/common.sh@62 -- # wait 2583786 00:04:48.351 06:47:18 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:48.351 06:47:18 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:48.351 06:47:18 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:48.351 06:47:18 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:48.351 06:47:18 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:48.351 06:47:18 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:48.351 06:47:18 -- setup/devices.sh@161 -- # break 00:04:48.351 06:47:18 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:48.351 06:47:18 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:48.351 06:47:18 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:48.351 06:47:18 -- setup/devices.sh@166 -- # dm=dm-0 00:04:48.351 06:47:18 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:48.351 06:47:18 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:48.351 06:47:18 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:48.351 06:47:18 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:48.351 06:47:18 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:48.351 06:47:18 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:48.351 06:47:18 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:48.351 06:47:18 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:48.351 06:47:18 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:48.351 06:47:18 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:48.351 06:47:18 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:48.351 06:47:18 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:48.351 06:47:18 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:48.351 06:47:18 -- setup/devices.sh@53 -- # local found=0 00:04:48.351 06:47:18 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:48.351 06:47:18 -- setup/devices.sh@56 -- # : 
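The dm_mount steps above create a device-mapper node over the two freshly cut partitions, resolve it to its dm-N name, and confirm both partitions list it as a holder. A hedged sketch follows; the trace does not show the table handed to dmsetup create, so the linear concatenation below is an assumption for illustration only.

p1=/dev/nvme0n1p1; p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1"); s2=$(blockdev --getsz "$p2")   # sizes in 512B sectors
# table format: start length linear device offset
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
dm=$(readlink -f /dev/mapper/nvme_dm_test)                   # e.g. /dev/dm-0
test -e /sys/class/block/nvme0n1p1/holders/${dm##*/}         # holder checks, as in the trace
test -e /sys/class/block/nvme0n1p2/holders/${dm##*/}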
00:04:48.351 06:47:18 -- setup/devices.sh@59 -- # local pci status 00:04:48.351 06:47:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:48.351 06:47:18 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:48.351 06:47:18 -- setup/devices.sh@47 -- # setup output config 00:04:48.351 06:47:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:48.351 06:47:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:51.643 06:47:21 -- setup/devices.sh@63 -- # found=1 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:51.643 06:47:21 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:51.643 06:47:21 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:51.643 06:47:21 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:51.643 06:47:21 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:51.643 06:47:21 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:51.643 06:47:21 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:51.643 06:47:21 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:51.643 06:47:21 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:51.643 06:47:21 -- setup/devices.sh@50 -- # local mount_point= 00:04:51.643 06:47:21 -- setup/devices.sh@51 -- # local test_file= 00:04:51.643 06:47:21 -- setup/devices.sh@53 -- # local found=0 00:04:51.643 06:47:21 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:51.643 06:47:21 -- setup/devices.sh@59 -- # local pci status 00:04:51.643 06:47:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.643 06:47:21 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:51.643 06:47:21 -- setup/devices.sh@47 -- # setup output config 00:04:51.643 06:47:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.643 06:47:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:54.937 06:47:24 -- setup/devices.sh@63 -- # found=1 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.937 06:47:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.937 06:47:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.197 06:47:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:55.197 06:47:24 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:55.197 06:47:24 -- setup/devices.sh@68 -- # return 0 00:04:55.197 06:47:24 -- setup/devices.sh@187 -- # cleanup_dm 00:04:55.197 06:47:24 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.197 06:47:24 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:55.197 06:47:24 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:55.197 06:47:24 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:55.197 06:47:24 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:55.197 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:55.197 06:47:24 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:55.197 06:47:24 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:55.197 00:04:55.197 real 0m10.054s 00:04:55.197 user 0m2.466s 00:04:55.197 sys 0m4.685s 00:04:55.197 06:47:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.197 06:47:24 -- common/autotest_common.sh@10 -- # set +x 00:04:55.197 ************************************ 00:04:55.197 END TEST dm_mount 00:04:55.197 ************************************ 00:04:55.197 06:47:25 -- setup/devices.sh@1 -- # cleanup 00:04:55.197 06:47:25 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:55.197 06:47:25 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.197 06:47:25 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:55.197 06:47:25 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:55.197 06:47:25 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:55.197 06:47:25 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:55.457 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:55.457 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:55.457 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:55.457 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:55.457 06:47:25 -- setup/devices.sh@12 -- # cleanup_dm 00:04:55.457 06:47:25 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.457 06:47:25 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:55.457 06:47:25 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:55.457 06:47:25 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:55.457 06:47:25 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:55.457 06:47:25 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:55.457 00:04:55.457 real 0m26.823s 00:04:55.457 user 0m7.526s 00:04:55.457 sys 0m14.250s 00:04:55.457 06:47:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.457 06:47:25 -- common/autotest_common.sh@10 -- # set +x 00:04:55.457 ************************************ 00:04:55.457 END TEST devices 00:04:55.457 ************************************ 00:04:55.457 00:04:55.457 real 1m33.510s 00:04:55.457 user 0m28.957s 00:04:55.457 sys 0m53.417s 00:04:55.457 06:47:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.457 06:47:25 -- common/autotest_common.sh@10 -- # set +x 00:04:55.457 ************************************ 00:04:55.457 END TEST setup.sh 00:04:55.457 ************************************ 00:04:55.716 06:47:25 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:59.006 Hugepages 00:04:59.006 node hugesize free / total 00:04:59.006 node0 1048576kB 0 / 0 00:04:59.006 node0 2048kB 2048 / 2048 00:04:59.006 node1 1048576kB 0 / 0 00:04:59.006 node1 2048kB 0 / 0 00:04:59.006 00:04:59.006 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:59.006 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:59.006 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:59.006 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:59.006 06:47:28 -- spdk/autotest.sh@141 -- # uname -s 00:04:59.006 06:47:28 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:04:59.006 06:47:28 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:04:59.006 06:47:28 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:02.340 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:05:02.340 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:02.340 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:04.249 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:04.249 06:47:33 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:05.188 06:47:34 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:05.188 06:47:34 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:05.188 06:47:34 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:05:05.188 06:47:34 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:05:05.188 06:47:34 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:05.188 06:47:34 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:05.188 06:47:34 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:05.188 06:47:34 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:05.188 06:47:34 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:05.188 06:47:34 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:05.188 06:47:34 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:05.188 06:47:34 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:08.481 Waiting for block devices as requested 00:05:08.481 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:08.481 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:08.481 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:08.740 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:08.740 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:08.740 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:08.740 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:08.999 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:08.999 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:09.000 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:09.259 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:09.259 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:09.259 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:09.518 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:09.518 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:09.518 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:09.777 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:09.777 06:47:39 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:05:09.777 06:47:39 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:09.777 06:47:39 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:09.777 06:47:39 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:09.777 06:47:39 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:09.777 06:47:39 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:09.778 06:47:39 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:09.778 06:47:39 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:09.778 06:47:39 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:05:09.778 06:47:39 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:05:09.778 06:47:39 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:05:09.778 06:47:39 -- common/autotest_common.sh@1530 -- # grep oacs 00:05:09.778 06:47:39 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:05:09.778 06:47:39 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:05:09.778 06:47:39 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:05:09.778 06:47:39 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:05:09.778 06:47:39 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:05:09.778 06:47:39 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:05:09.778 06:47:39 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:05:09.778 06:47:39 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:05:09.778 06:47:39 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:05:09.778 06:47:39 -- common/autotest_common.sh@1542 -- # continue 00:05:09.778 06:47:39 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:05:09.778 06:47:39 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:09.778 06:47:39 -- common/autotest_common.sh@10 -- # set +x 00:05:09.778 06:47:39 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:05:09.778 06:47:39 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:09.778 06:47:39 -- common/autotest_common.sh@10 -- # set +x 00:05:10.037 06:47:39 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:13.359 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:13.359 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:15.271 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:15.271 06:47:44 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:15.271 06:47:44 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:15.271 06:47:44 -- common/autotest_common.sh@10 -- # set +x 00:05:15.271 06:47:44 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:15.271 06:47:44 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:15.271 06:47:44 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:15.271 06:47:44 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:15.271 06:47:44 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:15.271 06:47:44 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:05:15.271 06:47:44 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:15.271 06:47:44 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:15.271 06:47:44 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:15.271 06:47:44 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:15.271 06:47:44 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:15.271 06:47:44 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:15.271 06:47:44 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:15.271 06:47:44 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:05:15.271 06:47:44 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:15.271 06:47:44 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:05:15.271 06:47:44 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:15.271 06:47:44 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:05:15.271 06:47:44 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:15.271 06:47:44 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:15.271 06:47:44 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=2593762 00:05:15.271 06:47:44 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:15.271 06:47:44 -- common/autotest_common.sh@1583 -- # waitforlisten 2593762 00:05:15.271 06:47:44 -- common/autotest_common.sh@819 -- # '[' -z 2593762 ']' 00:05:15.271 06:47:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:15.271 06:47:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:15.271 06:47:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:15.271 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:15.271 06:47:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:15.271 06:47:44 -- common/autotest_common.sh@10 -- # set +x 00:05:15.271 [2024-04-27 06:47:44.987801] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
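The opal_revert_cleanup prologue above narrows the NVMe list down by PCI device ID: each BDF is kept only if its sysfs device file reads 0x0a54. An equivalent sysfs-only illustration (the trace itself goes through scripts/gen_nvme.sh piped into jq, not this walk):

bdfs=()
for ctrl in /sys/class/nvme/nvme*; do
  bdf=$(basename "$(readlink -f "$ctrl/device")")          # e.g. 0000:d8:00.0
  [[ $(cat /sys/bus/pci/devices/"$bdf"/device) == 0x0a54 ]] && bdfs+=("$bdf")
done
printf '%s\n' "${bdfs[@]}"                                 # -> 0000:d8:00.0 here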
00:05:15.271 [2024-04-27 06:47:44.987882] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2593762 ] 00:05:15.271 EAL: No free 2048 kB hugepages reported on node 1 00:05:15.271 [2024-04-27 06:47:45.057338] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:15.271 [2024-04-27 06:47:45.093191] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:15.271 [2024-04-27 06:47:45.093326] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.211 06:47:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:16.211 06:47:45 -- common/autotest_common.sh@852 -- # return 0 00:05:16.211 06:47:45 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:05:16.211 06:47:45 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:05:16.211 06:47:45 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:19.505 nvme0n1 00:05:19.505 06:47:48 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:19.505 [2024-04-27 06:47:48.946400] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:19.505 request: 00:05:19.505 { 00:05:19.505 "nvme_ctrlr_name": "nvme0", 00:05:19.505 "password": "test", 00:05:19.505 "method": "bdev_nvme_opal_revert", 00:05:19.505 "req_id": 1 00:05:19.505 } 00:05:19.505 Got JSON-RPC error response 00:05:19.505 response: 00:05:19.505 { 00:05:19.505 "code": -32602, 00:05:19.505 "message": "Invalid parameters" 00:05:19.505 } 00:05:19.505 06:47:48 -- common/autotest_common.sh@1589 -- # true 00:05:19.505 06:47:48 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:05:19.505 06:47:48 -- common/autotest_common.sh@1593 -- # killprocess 2593762 00:05:19.505 06:47:48 -- common/autotest_common.sh@926 -- # '[' -z 2593762 ']' 00:05:19.505 06:47:48 -- common/autotest_common.sh@930 -- # kill -0 2593762 00:05:19.505 06:47:48 -- common/autotest_common.sh@931 -- # uname 00:05:19.505 06:47:48 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:19.505 06:47:48 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2593762 00:05:19.505 06:47:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:19.505 06:47:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:19.505 06:47:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2593762' 00:05:19.505 killing process with pid 2593762 00:05:19.505 06:47:49 -- common/autotest_common.sh@945 -- # kill 2593762 00:05:19.505 06:47:49 -- common/autotest_common.sh@950 -- # wait 2593762 00:05:21.411 06:47:51 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:05:21.411 06:47:51 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:05:21.411 06:47:51 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:21.411 06:47:51 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:21.411 06:47:51 -- spdk/autotest.sh@173 -- # timing_enter lib 00:05:21.411 06:47:51 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:21.411 06:47:51 -- common/autotest_common.sh@10 -- # set +x 00:05:21.411 06:47:51 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:21.411 
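The bdev_nvme_opal_revert exchange above is plain JSON-RPC over the spdk_tgt UNIX socket, and the -32602 response is the expected outcome here: the controller reports no OPAL support. Roughly the same exchange driven by hand, with paths as in the trace:

cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
./scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
./scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test \
  || echo 'revert rejected: nvme0 does not support opal'   # matches the error traced above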
06:47:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:21.411 06:47:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:21.411 06:47:51 -- common/autotest_common.sh@10 -- # set +x 00:05:21.411 ************************************ 00:05:21.411 START TEST env 00:05:21.411 ************************************ 00:05:21.411 06:47:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:21.411 * Looking for test storage... 00:05:21.672 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:21.672 06:47:51 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:21.672 06:47:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:21.672 06:47:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:21.672 06:47:51 -- common/autotest_common.sh@10 -- # set +x 00:05:21.672 ************************************ 00:05:21.672 START TEST env_memory 00:05:21.672 ************************************ 00:05:21.672 06:47:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:21.672 00:05:21.672 00:05:21.672 CUnit - A unit testing framework for C - Version 2.1-3 00:05:21.672 http://cunit.sourceforge.net/ 00:05:21.672 00:05:21.672 00:05:21.672 Suite: memory 00:05:21.672 Test: alloc and free memory map ...[2024-04-27 06:47:51.351799] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:21.672 passed 00:05:21.672 Test: mem map translation ...[2024-04-27 06:47:51.365440] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:21.672 [2024-04-27 06:47:51.365457] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:21.672 [2024-04-27 06:47:51.365485] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:21.672 [2024-04-27 06:47:51.365494] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:21.672 passed 00:05:21.672 Test: mem map registration ...[2024-04-27 06:47:51.386227] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:21.672 [2024-04-27 06:47:51.386242] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:21.672 passed 00:05:21.672 Test: mem map adjacent registrations ...passed 00:05:21.672 00:05:21.672 Run Summary: Type Total Ran Passed Failed Inactive 00:05:21.672 suites 1 1 n/a 0 0 00:05:21.672 tests 4 4 4 0 0 00:05:21.672 asserts 152 152 152 0 n/a 00:05:21.672 00:05:21.672 Elapsed time = 0.086 seconds 00:05:21.672 00:05:21.672 real 0m0.098s 00:05:21.672 user 0m0.089s 00:05:21.672 sys 0m0.009s 00:05:21.672 06:47:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:21.672 06:47:51 -- 
common/autotest_common.sh@10 -- # set +x 00:05:21.672 ************************************ 00:05:21.672 END TEST env_memory 00:05:21.672 ************************************ 00:05:21.672 06:47:51 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:21.672 06:47:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:21.672 06:47:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:21.672 06:47:51 -- common/autotest_common.sh@10 -- # set +x 00:05:21.672 ************************************ 00:05:21.672 START TEST env_vtophys 00:05:21.672 ************************************ 00:05:21.672 06:47:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:21.672 EAL: lib.eal log level changed from notice to debug 00:05:21.672 EAL: Detected lcore 0 as core 0 on socket 0 00:05:21.672 EAL: Detected lcore 1 as core 1 on socket 0 00:05:21.672 EAL: Detected lcore 2 as core 2 on socket 0 00:05:21.672 EAL: Detected lcore 3 as core 3 on socket 0 00:05:21.672 EAL: Detected lcore 4 as core 4 on socket 0 00:05:21.672 EAL: Detected lcore 5 as core 5 on socket 0 00:05:21.672 EAL: Detected lcore 6 as core 6 on socket 0 00:05:21.672 EAL: Detected lcore 7 as core 8 on socket 0 00:05:21.672 EAL: Detected lcore 8 as core 9 on socket 0 00:05:21.672 EAL: Detected lcore 9 as core 10 on socket 0 00:05:21.672 EAL: Detected lcore 10 as core 11 on socket 0 00:05:21.672 EAL: Detected lcore 11 as core 12 on socket 0 00:05:21.672 EAL: Detected lcore 12 as core 13 on socket 0 00:05:21.672 EAL: Detected lcore 13 as core 14 on socket 0 00:05:21.672 EAL: Detected lcore 14 as core 16 on socket 0 00:05:21.672 EAL: Detected lcore 15 as core 17 on socket 0 00:05:21.672 EAL: Detected lcore 16 as core 18 on socket 0 00:05:21.672 EAL: Detected lcore 17 as core 19 on socket 0 00:05:21.672 EAL: Detected lcore 18 as core 20 on socket 0 00:05:21.672 EAL: Detected lcore 19 as core 21 on socket 0 00:05:21.672 EAL: Detected lcore 20 as core 22 on socket 0 00:05:21.672 EAL: Detected lcore 21 as core 24 on socket 0 00:05:21.672 EAL: Detected lcore 22 as core 25 on socket 0 00:05:21.672 EAL: Detected lcore 23 as core 26 on socket 0 00:05:21.672 EAL: Detected lcore 24 as core 27 on socket 0 00:05:21.672 EAL: Detected lcore 25 as core 28 on socket 0 00:05:21.672 EAL: Detected lcore 26 as core 29 on socket 0 00:05:21.672 EAL: Detected lcore 27 as core 30 on socket 0 00:05:21.672 EAL: Detected lcore 28 as core 0 on socket 1 00:05:21.672 EAL: Detected lcore 29 as core 1 on socket 1 00:05:21.672 EAL: Detected lcore 30 as core 2 on socket 1 00:05:21.672 EAL: Detected lcore 31 as core 3 on socket 1 00:05:21.672 EAL: Detected lcore 32 as core 4 on socket 1 00:05:21.672 EAL: Detected lcore 33 as core 5 on socket 1 00:05:21.672 EAL: Detected lcore 34 as core 6 on socket 1 00:05:21.672 EAL: Detected lcore 35 as core 8 on socket 1 00:05:21.672 EAL: Detected lcore 36 as core 9 on socket 1 00:05:21.672 EAL: Detected lcore 37 as core 10 on socket 1 00:05:21.672 EAL: Detected lcore 38 as core 11 on socket 1 00:05:21.672 EAL: Detected lcore 39 as core 12 on socket 1 00:05:21.672 EAL: Detected lcore 40 as core 13 on socket 1 00:05:21.672 EAL: Detected lcore 41 as core 14 on socket 1 00:05:21.672 EAL: Detected lcore 42 as core 16 on socket 1 00:05:21.673 EAL: Detected lcore 43 as core 17 on socket 1 00:05:21.673 EAL: Detected lcore 44 as core 18 on socket 1 00:05:21.673 EAL: Detected lcore 45 as core 19 on 
socket 1 00:05:21.673 EAL: Detected lcore 46 as core 20 on socket 1 00:05:21.673 EAL: Detected lcore 47 as core 21 on socket 1 00:05:21.673 EAL: Detected lcore 48 as core 22 on socket 1 00:05:21.673 EAL: Detected lcore 49 as core 24 on socket 1 00:05:21.673 EAL: Detected lcore 50 as core 25 on socket 1 00:05:21.673 EAL: Detected lcore 51 as core 26 on socket 1 00:05:21.673 EAL: Detected lcore 52 as core 27 on socket 1 00:05:21.673 EAL: Detected lcore 53 as core 28 on socket 1 00:05:21.673 EAL: Detected lcore 54 as core 29 on socket 1 00:05:21.673 EAL: Detected lcore 55 as core 30 on socket 1 00:05:21.673 EAL: Detected lcore 56 as core 0 on socket 0 00:05:21.673 EAL: Detected lcore 57 as core 1 on socket 0 00:05:21.673 EAL: Detected lcore 58 as core 2 on socket 0 00:05:21.673 EAL: Detected lcore 59 as core 3 on socket 0 00:05:21.673 EAL: Detected lcore 60 as core 4 on socket 0 00:05:21.673 EAL: Detected lcore 61 as core 5 on socket 0 00:05:21.673 EAL: Detected lcore 62 as core 6 on socket 0 00:05:21.673 EAL: Detected lcore 63 as core 8 on socket 0 00:05:21.673 EAL: Detected lcore 64 as core 9 on socket 0 00:05:21.673 EAL: Detected lcore 65 as core 10 on socket 0 00:05:21.673 EAL: Detected lcore 66 as core 11 on socket 0 00:05:21.673 EAL: Detected lcore 67 as core 12 on socket 0 00:05:21.673 EAL: Detected lcore 68 as core 13 on socket 0 00:05:21.673 EAL: Detected lcore 69 as core 14 on socket 0 00:05:21.673 EAL: Detected lcore 70 as core 16 on socket 0 00:05:21.673 EAL: Detected lcore 71 as core 17 on socket 0 00:05:21.673 EAL: Detected lcore 72 as core 18 on socket 0 00:05:21.673 EAL: Detected lcore 73 as core 19 on socket 0 00:05:21.673 EAL: Detected lcore 74 as core 20 on socket 0 00:05:21.673 EAL: Detected lcore 75 as core 21 on socket 0 00:05:21.673 EAL: Detected lcore 76 as core 22 on socket 0 00:05:21.673 EAL: Detected lcore 77 as core 24 on socket 0 00:05:21.673 EAL: Detected lcore 78 as core 25 on socket 0 00:05:21.673 EAL: Detected lcore 79 as core 26 on socket 0 00:05:21.673 EAL: Detected lcore 80 as core 27 on socket 0 00:05:21.673 EAL: Detected lcore 81 as core 28 on socket 0 00:05:21.673 EAL: Detected lcore 82 as core 29 on socket 0 00:05:21.673 EAL: Detected lcore 83 as core 30 on socket 0 00:05:21.673 EAL: Detected lcore 84 as core 0 on socket 1 00:05:21.673 EAL: Detected lcore 85 as core 1 on socket 1 00:05:21.673 EAL: Detected lcore 86 as core 2 on socket 1 00:05:21.673 EAL: Detected lcore 87 as core 3 on socket 1 00:05:21.673 EAL: Detected lcore 88 as core 4 on socket 1 00:05:21.673 EAL: Detected lcore 89 as core 5 on socket 1 00:05:21.673 EAL: Detected lcore 90 as core 6 on socket 1 00:05:21.673 EAL: Detected lcore 91 as core 8 on socket 1 00:05:21.673 EAL: Detected lcore 92 as core 9 on socket 1 00:05:21.673 EAL: Detected lcore 93 as core 10 on socket 1 00:05:21.673 EAL: Detected lcore 94 as core 11 on socket 1 00:05:21.673 EAL: Detected lcore 95 as core 12 on socket 1 00:05:21.673 EAL: Detected lcore 96 as core 13 on socket 1 00:05:21.673 EAL: Detected lcore 97 as core 14 on socket 1 00:05:21.673 EAL: Detected lcore 98 as core 16 on socket 1 00:05:21.673 EAL: Detected lcore 99 as core 17 on socket 1 00:05:21.673 EAL: Detected lcore 100 as core 18 on socket 1 00:05:21.673 EAL: Detected lcore 101 as core 19 on socket 1 00:05:21.673 EAL: Detected lcore 102 as core 20 on socket 1 00:05:21.673 EAL: Detected lcore 103 as core 21 on socket 1 00:05:21.673 EAL: Detected lcore 104 as core 22 on socket 1 00:05:21.673 EAL: Detected lcore 105 as core 24 on socket 1 00:05:21.673 EAL: 
Detected lcore 106 as core 25 on socket 1 00:05:21.673 EAL: Detected lcore 107 as core 26 on socket 1 00:05:21.673 EAL: Detected lcore 108 as core 27 on socket 1 00:05:21.673 EAL: Detected lcore 109 as core 28 on socket 1 00:05:21.673 EAL: Detected lcore 110 as core 29 on socket 1 00:05:21.673 EAL: Detected lcore 111 as core 30 on socket 1 00:05:21.673 EAL: Maximum logical cores by configuration: 128 00:05:21.673 EAL: Detected CPU lcores: 112 00:05:21.673 EAL: Detected NUMA nodes: 2 00:05:21.673 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:21.673 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:21.673 EAL: Checking presence of .so 'librte_eal.so' 00:05:21.673 EAL: Detected static linkage of DPDK 00:05:21.673 EAL: No shared files mode enabled, IPC will be disabled 00:05:21.673 EAL: Bus pci wants IOVA as 'DC' 00:05:21.673 EAL: Buses did not request a specific IOVA mode. 00:05:21.673 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:21.673 EAL: Selected IOVA mode 'VA' 00:05:21.673 EAL: No free 2048 kB hugepages reported on node 1 00:05:21.673 EAL: Probing VFIO support... 00:05:21.673 EAL: IOMMU type 1 (Type 1) is supported 00:05:21.673 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:21.673 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:21.673 EAL: VFIO support initialized 00:05:21.673 EAL: Ask a virtual area of 0x2e000 bytes 00:05:21.673 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:21.673 EAL: Setting up physically contiguous memory... 00:05:21.673 EAL: Setting maximum number of open files to 524288 00:05:21.673 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:21.673 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:21.673 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:21.673 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.673 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.673 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:21.673 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:21.673 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.673 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.673 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:21.673 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:21.673 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.673 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.673 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:21.673 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:21.673 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.673 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.673 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:21.673 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:21.673 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:21.673 
EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.673 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.673 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:21.673 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:21.673 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.673 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.673 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:21.673 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:21.673 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.673 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:21.673 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.674 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.674 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:21.674 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:21.674 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.674 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:21.674 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.674 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.674 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:21.674 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:21.674 EAL: Hugepages will be freed exactly as allocated. 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: TSC frequency is ~2500000 KHz 00:05:21.674 EAL: Main lcore 0 is ready (tid=7fa07af69a00;cpuset=[0]) 00:05:21.674 EAL: Trying to obtain current memory policy. 00:05:21.674 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.674 EAL: Restoring previous memory policy: 0 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was expanded by 2MB 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Mem event callback 'spdk:(nil)' registered 00:05:21.674 00:05:21.674 00:05:21.674 CUnit - A unit testing framework for C - Version 2.1-3 00:05:21.674 http://cunit.sourceforge.net/ 00:05:21.674 00:05:21.674 00:05:21.674 Suite: components_suite 00:05:21.674 Test: vtophys_malloc_test ...passed 00:05:21.674 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:21.674 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.674 EAL: Restoring previous memory policy: 4 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was expanded by 4MB 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was shrunk by 4MB 00:05:21.674 EAL: Trying to obtain current memory policy. 
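Everything from the lcore table above through the per-socket memseg reservations is standard DPDK EAL startup, driven here by SPDK's env layer. For orientation, a minimal standalone sketch (not this test's own source; it assumes only the public spdk/env.h API, and the program name env_demo is invented) that triggers the same initialization and walks the detected topology:

#include "spdk/env.h"
#include <stdio.h>

int main(void)
{
	struct spdk_env_opts opts;
	uint32_t core;

	spdk_env_opts_init(&opts);
	opts.name = "env_demo"; /* illustrative name, not from the log */
	/* Mirrors the --base-virtaddr=0x200000000000 this run passes along,
	 * which is why the virtual areas above start at 0x200000000000. */
	opts.base_virtaddr = 0x200000000000ULL;

	if (spdk_env_init(&opts) < 0) {
		fprintf(stderr, "spdk_env_init() failed\n");
		return 1;
	}

	/* Walk the lcores EAL detected; the socket ids match the table above. */
	SPDK_ENV_FOREACH_CORE(core) {
		printf("lcore %u is on socket %d\n", core, (int)spdk_env_get_socket_id(core));
	}
	return 0;
}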
00:05:21.674 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.674 EAL: Restoring previous memory policy: 4 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was expanded by 6MB 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was shrunk by 6MB 00:05:21.674 EAL: Trying to obtain current memory policy. 00:05:21.674 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.674 EAL: Restoring previous memory policy: 4 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was expanded by 10MB 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was shrunk by 10MB 00:05:21.674 EAL: Trying to obtain current memory policy. 00:05:21.674 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.674 EAL: Restoring previous memory policy: 4 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was expanded by 18MB 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was shrunk by 18MB 00:05:21.674 EAL: Trying to obtain current memory policy. 00:05:21.674 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.674 EAL: Restoring previous memory policy: 4 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.674 EAL: request: mp_malloc_sync 00:05:21.674 EAL: No shared files mode enabled, IPC is disabled 00:05:21.674 EAL: Heap on socket 0 was expanded by 34MB 00:05:21.674 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was shrunk by 34MB 00:05:21.934 EAL: Trying to obtain current memory policy. 00:05:21.934 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.934 EAL: Restoring previous memory policy: 4 00:05:21.934 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was expanded by 66MB 00:05:21.934 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was shrunk by 66MB 00:05:21.934 EAL: Trying to obtain current memory policy. 
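Each "expanded by N / shrunk by N" pair here (the cycles continue below with progressively larger sizes, up to 1026MB) is SPDK's dynamic memory path at work: an allocation that outgrows the heap pulls in more 2 MB hugepages and fires the registered 'spdk:(nil)' mem event callback, and the matching free lets the heap contract again. A hedged sketch of that allocate/release pattern, assuming an already-initialized env; the helper name exercise_heap is invented:

#include "spdk/env.h"
#include <stdio.h>

/* Allocate and release a DMA-safe buffer. Sizes beyond the current heap
 * produce the "Heap on socket 0 was expanded by ..." / "... shrunk by ..."
 * pairs seen in this stretch of the log. */
static void exercise_heap(size_t size)
{
	void *buf = spdk_malloc(size, 0x1000 /* 4 KiB alignment */, NULL,
				SPDK_ENV_SOCKET_ID_ANY, SPDK_MALLOC_DMA);
	if (buf == NULL) {
		fprintf(stderr, "spdk_malloc(%zu) failed\n", size);
		return;
	}
	spdk_free(buf);
}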
00:05:21.934 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.934 EAL: Restoring previous memory policy: 4 00:05:21.934 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was expanded by 130MB 00:05:21.934 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was shrunk by 130MB 00:05:21.934 EAL: Trying to obtain current memory policy. 00:05:21.934 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.934 EAL: Restoring previous memory policy: 4 00:05:21.934 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was expanded by 258MB 00:05:21.934 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.934 EAL: request: mp_malloc_sync 00:05:21.934 EAL: No shared files mode enabled, IPC is disabled 00:05:21.934 EAL: Heap on socket 0 was shrunk by 258MB 00:05:21.934 EAL: Trying to obtain current memory policy. 00:05:21.934 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:22.194 EAL: Restoring previous memory policy: 4 00:05:22.194 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.194 EAL: request: mp_malloc_sync 00:05:22.194 EAL: No shared files mode enabled, IPC is disabled 00:05:22.194 EAL: Heap on socket 0 was expanded by 514MB 00:05:22.194 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.194 EAL: request: mp_malloc_sync 00:05:22.194 EAL: No shared files mode enabled, IPC is disabled 00:05:22.194 EAL: Heap on socket 0 was shrunk by 514MB 00:05:22.194 EAL: Trying to obtain current memory policy. 
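What vtophys_spdk_malloc_test asserts across these cycles is that each hugepage-backed buffer keeps a valid virtual-to-physical (here IOVA=VA) translation. Roughly, as an illustration rather than the test's literal code (check_translation is an invented name):

#include "spdk/env.h"
#include <inttypes.h>
#include <stdio.h>

/* Look up the physical address / IOVA for a buffer. This run selected IOVA
 * mode 'VA', so a valid translation equals the virtual address itself;
 * SPDK_VTOPHYS_ERROR means the buffer is unknown to SPDK's memory maps. */
static int check_translation(void *buf)
{
	uint64_t paddr = spdk_vtophys(buf, NULL);

	if (paddr == SPDK_VTOPHYS_ERROR) {
		fprintf(stderr, "no translation for %p\n", buf);
		return -1;
	}
	printf("%p -> 0x%" PRIx64 "\n", buf, paddr);
	return 0;
}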
00:05:22.194 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:22.453 EAL: Restoring previous memory policy: 4 00:05:22.453 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.453 EAL: request: mp_malloc_sync 00:05:22.453 EAL: No shared files mode enabled, IPC is disabled 00:05:22.453 EAL: Heap on socket 0 was expanded by 1026MB 00:05:22.713 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.713 EAL: request: mp_malloc_sync 00:05:22.713 EAL: No shared files mode enabled, IPC is disabled 00:05:22.713 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:22.713 passed 00:05:22.713 00:05:22.713 Run Summary: Type Total Ran Passed Failed Inactive 00:05:22.713 suites 1 1 n/a 0 0 00:05:22.713 tests 2 2 2 0 0 00:05:22.713 asserts 497 497 497 0 n/a 00:05:22.713 00:05:22.713 Elapsed time = 0.962 seconds 00:05:22.713 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.713 EAL: request: mp_malloc_sync 00:05:22.713 EAL: No shared files mode enabled, IPC is disabled 00:05:22.713 EAL: Heap on socket 0 was shrunk by 2MB 00:05:22.713 EAL: No shared files mode enabled, IPC is disabled 00:05:22.713 EAL: No shared files mode enabled, IPC is disabled 00:05:22.713 EAL: No shared files mode enabled, IPC is disabled 00:05:22.713 00:05:22.713 real 0m1.078s 00:05:22.713 user 0m0.625s 00:05:22.713 sys 0m0.428s 00:05:22.713 06:47:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:22.713 06:47:52 -- common/autotest_common.sh@10 -- # set +x 00:05:22.713 ************************************ 00:05:22.713 END TEST env_vtophys 00:05:22.713 ************************************ 00:05:22.713 06:47:52 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:22.713 06:47:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:22.713 06:47:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:22.713 06:47:52 -- common/autotest_common.sh@10 -- # set +x 00:05:22.713 ************************************ 00:05:22.713 START TEST env_pci 00:05:22.713 ************************************ 00:05:22.713 06:47:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:22.713 00:05:22.713 00:05:22.713 CUnit - A unit testing framework for C - Version 2.1-3 00:05:22.713 http://cunit.sourceforge.net/ 00:05:22.713 00:05:22.713 00:05:22.713 Suite: pci 00:05:22.713 Test: pci_hook ...[2024-04-27 06:47:52.598219] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2595133 has claimed it 00:05:22.973 EAL: Cannot find device (10000:00:01.0) 00:05:22.973 EAL: Failed to attach device on primary process 00:05:22.973 passed 00:05:22.973 00:05:22.973 Run Summary: Type Total Ran Passed Failed Inactive 00:05:22.973 suites 1 1 n/a 0 0 00:05:22.973 tests 1 1 1 0 0 00:05:22.973 asserts 25 25 25 0 n/a 00:05:22.973 00:05:22.973 Elapsed time = 0.036 seconds 00:05:22.973 00:05:22.973 real 0m0.054s 00:05:22.973 user 0m0.014s 00:05:22.973 sys 0m0.040s 00:05:22.973 06:47:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:22.973 06:47:52 -- common/autotest_common.sh@10 -- # set +x 00:05:22.973 ************************************ 00:05:22.973 END TEST env_pci 00:05:22.973 ************************************ 00:05:22.973 06:47:52 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:22.973 06:47:52 -- env/env.sh@15 -- # uname 00:05:22.973 06:47:52 -- env/env.sh@15 -- # '[' 
Linux = Linux ']' 00:05:22.973 06:47:52 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:22.973 06:47:52 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:22.973 06:47:52 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:22.973 06:47:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:22.973 06:47:52 -- common/autotest_common.sh@10 -- # set +x 00:05:22.973 ************************************ 00:05:22.973 START TEST env_dpdk_post_init 00:05:22.973 ************************************ 00:05:22.973 06:47:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:22.973 EAL: Detected CPU lcores: 112 00:05:22.973 EAL: Detected NUMA nodes: 2 00:05:22.973 EAL: Detected static linkage of DPDK 00:05:22.973 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:22.973 EAL: Selected IOVA mode 'VA' 00:05:22.973 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.973 EAL: VFIO support initialized 00:05:22.973 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:22.973 EAL: Using IOMMU type 1 (Type 1) 00:05:23.911 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:27.202 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:27.202 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:27.772 Starting DPDK initialization... 00:05:27.772 Starting SPDK post initialization... 00:05:27.772 SPDK NVMe probe 00:05:27.772 Attaching to 0000:d8:00.0 00:05:27.772 Attached to 0000:d8:00.0 00:05:27.772 Cleaning up... 
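The "Starting SPDK post initialization... / Attaching to 0000:d8:00.0 / Attached to 0000:d8:00.0" lines above are the usual NVMe controller enumeration done once the env is up. Sketched with the long-standing spdk_nvme_probe() callback API (callback bodies reduced to logging; probe_all is an invented wrapper, not the test's source):

#include "spdk/nvme.h"
#include <stdbool.h>
#include <stdio.h>

static bool
probe_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
	 struct spdk_nvme_ctrlr_opts *opts)
{
	printf("Attaching to %s\n", trid->traddr);
	return true; /* true = go ahead and attach this controller */
}

static void
attach_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
	  struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
	printf("Attached to %s\n", trid->traddr); /* e.g. 0000:d8:00.0 above */
}

/* Enumerate local PCIe NVMe controllers; a NULL transport ID means
 * "probe the local PCIe bus". */
static int probe_all(void)
{
	return spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL);
}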
00:05:27.772 00:05:27.772 real 0m4.739s 00:05:27.772 user 0m3.540s 00:05:27.772 sys 0m0.441s 00:05:27.772 06:47:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.772 06:47:57 -- common/autotest_common.sh@10 -- # set +x 00:05:27.772 ************************************ 00:05:27.772 END TEST env_dpdk_post_init 00:05:27.772 ************************************ 00:05:27.772 06:47:57 -- env/env.sh@26 -- # uname 00:05:27.772 06:47:57 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:27.772 06:47:57 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:27.772 06:47:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:27.772 06:47:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:27.772 06:47:57 -- common/autotest_common.sh@10 -- # set +x 00:05:27.772 ************************************ 00:05:27.772 START TEST env_mem_callbacks 00:05:27.772 ************************************ 00:05:27.772 06:47:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:27.772 EAL: Detected CPU lcores: 112 00:05:27.772 EAL: Detected NUMA nodes: 2 00:05:27.772 EAL: Detected static linkage of DPDK 00:05:27.772 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:27.772 EAL: Selected IOVA mode 'VA' 00:05:27.772 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.772 EAL: VFIO support initialized 00:05:27.772 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:27.772 00:05:27.772 00:05:27.772 CUnit - A unit testing framework for C - Version 2.1-3 00:05:27.772 http://cunit.sourceforge.net/ 00:05:27.772 00:05:27.772 00:05:27.772 Suite: memory 00:05:27.772 Test: test ... 
00:05:27.772 register 0x200000200000 2097152
00:05:27.772 malloc 3145728
00:05:27.772 register 0x200000400000 4194304
00:05:27.772 buf 0x200000500000 len 3145728 PASSED
00:05:27.772 malloc 64
00:05:27.772 buf 0x2000004fff40 len 64 PASSED
00:05:27.772 malloc 4194304
00:05:27.772 register 0x200000800000 6291456
00:05:27.772 buf 0x200000a00000 len 4194304 PASSED
00:05:27.772 free 0x200000500000 3145728
00:05:27.772 free 0x2000004fff40 64
00:05:27.772 unregister 0x200000400000 4194304 PASSED
00:05:27.772 free 0x200000a00000 4194304
00:05:27.772 unregister 0x200000800000 6291456 PASSED
00:05:27.772 malloc 8388608
00:05:27.772 register 0x200000400000 10485760
00:05:27.772 buf 0x200000600000 len 8388608 PASSED
00:05:27.772 free 0x200000600000 8388608
00:05:27.772 unregister 0x200000400000 10485760 PASSED
00:05:27.772 passed
00:05:27.772
00:05:27.772 Run Summary: Type Total Ran Passed Failed Inactive
00:05:27.772 suites 1 1 n/a 0 0
00:05:27.772 tests 1 1 1 0 0
00:05:27.772 asserts 15 15 15 0 n/a
00:05:27.772
00:05:27.772 Elapsed time = 0.005 seconds
00:05:27.772
00:05:27.772 real 0m0.061s
00:05:27.772 user 0m0.017s
00:05:27.772 sys 0m0.044s
00:05:27.772 06:47:57 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:27.772 06:47:57 -- common/autotest_common.sh@10 -- # set +x
00:05:27.772 ************************************
00:05:27.772 END TEST env_mem_callbacks
00:05:27.772 ************************************
00:05:27.772
00:05:27.772 real 0m6.364s
00:05:27.772 user 0m4.390s
00:05:27.772 sys 0m1.240s
00:05:27.772 06:47:57 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:27.772 06:47:57 -- common/autotest_common.sh@10 -- # set +x
00:05:27.772 ************************************
00:05:27.772 END TEST env
00:05:27.772 ************************************
00:05:27.772 06:47:57 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:05:27.772 06:47:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:27.772 06:47:57 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:27.772 06:47:57 -- common/autotest_common.sh@10 -- # set +x
00:05:27.772 ************************************
00:05:27.772 START TEST rpc
00:05:27.772 ************************************
00:05:27.772 06:47:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:05:28.033 * Looking for test storage...
00:05:28.033 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:05:28.033 06:47:57 -- rpc/rpc.sh@65 -- # spdk_pid=2596201
00:05:28.033 06:47:57 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:05:28.033 06:47:57 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:05:28.033 06:47:57 -- rpc/rpc.sh@67 -- # waitforlisten 2596201
00:05:28.033 06:47:57 -- common/autotest_common.sh@819 -- # '[' -z 2596201 ']'
00:05:28.033 06:47:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:28.033 06:47:57 -- common/autotest_common.sh@824 -- # local max_retries=100
00:05:28.033 06:47:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
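Stepping back to the env_mem_callbacks trace above (the rpc suite's startup continues below): those register/unregister lines show externally allocated memory being handed to and withdrawn from SPDK, which is exactly what fires any registered memory-event callbacks. A sketch of the same sequence, assuming the public spdk_mem_register()/spdk_mem_unregister() API, with an anonymous mmap standing in for the test's buffers and the 2 MB alignment/length requirement taken as given; register_external_buffer is an invented name:

#include "spdk/env.h"
#include <sys/mman.h>

/* Make an externally allocated region visible to spdk_vtophys() and any
 * registered memory maps, then withdraw it. buf and len are assumed to be
 * 2 MB-aligned / a 2 MB multiple, as SPDK requires for registration. */
static int register_external_buffer(size_t len)
{
	void *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
			 MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
	if (buf == MAP_FAILED) {
		return -1;
	}
	if (spdk_mem_register(buf, len) != 0) { /* the "register 0x... <len>" lines */
		munmap(buf, len);
		return -1;
	}
	/* ... DMA or I/O against buf would happen here ... */
	spdk_mem_unregister(buf, len);          /* the "unregister 0x... <len>" lines */
	munmap(buf, len);
	return 0;
}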
00:05:28.033 06:47:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:28.033 06:47:57 -- common/autotest_common.sh@10 -- # set +x 00:05:28.033 [2024-04-27 06:47:57.739073] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:28.033 [2024-04-27 06:47:57.739160] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2596201 ] 00:05:28.033 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.033 [2024-04-27 06:47:57.808033] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:28.033 [2024-04-27 06:47:57.845108] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:28.033 [2024-04-27 06:47:57.845215] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:28.033 [2024-04-27 06:47:57.845225] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2596201' to capture a snapshot of events at runtime. 00:05:28.033 [2024-04-27 06:47:57.845234] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2596201 for offline analysis/debug. 00:05:28.033 [2024-04-27 06:47:57.845258] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.035 06:47:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:29.035 06:47:58 -- common/autotest_common.sh@852 -- # return 0 00:05:29.035 06:47:58 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:29.035 06:47:58 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:29.035 06:47:58 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:29.035 06:47:58 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:29.035 06:47:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.035 06:47:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.035 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.035 ************************************ 00:05:29.035 START TEST rpc_integrity 00:05:29.035 ************************************ 00:05:29.035 06:47:58 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:29.035 06:47:58 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:29.035 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.035 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.035 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.035 06:47:58 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:29.035 06:47:58 -- rpc/rpc.sh@13 -- # jq length 00:05:29.035 06:47:58 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:29.035 06:47:58 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:29.035 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.035 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.035 06:47:58 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.035 06:47:58 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:29.035 06:47:58 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:29.035 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.035 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.035 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.035 06:47:58 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:29.035 { 00:05:29.035 "name": "Malloc0", 00:05:29.035 "aliases": [ 00:05:29.035 "7c4858db-b3d2-4b31-b3cc-4912e61dc006" 00:05:29.035 ], 00:05:29.035 "product_name": "Malloc disk", 00:05:29.035 "block_size": 512, 00:05:29.035 "num_blocks": 16384, 00:05:29.035 "uuid": "7c4858db-b3d2-4b31-b3cc-4912e61dc006", 00:05:29.035 "assigned_rate_limits": { 00:05:29.036 "rw_ios_per_sec": 0, 00:05:29.036 "rw_mbytes_per_sec": 0, 00:05:29.036 "r_mbytes_per_sec": 0, 00:05:29.036 "w_mbytes_per_sec": 0 00:05:29.036 }, 00:05:29.036 "claimed": false, 00:05:29.036 "zoned": false, 00:05:29.036 "supported_io_types": { 00:05:29.036 "read": true, 00:05:29.036 "write": true, 00:05:29.036 "unmap": true, 00:05:29.036 "write_zeroes": true, 00:05:29.036 "flush": true, 00:05:29.036 "reset": true, 00:05:29.036 "compare": false, 00:05:29.036 "compare_and_write": false, 00:05:29.036 "abort": true, 00:05:29.036 "nvme_admin": false, 00:05:29.036 "nvme_io": false 00:05:29.036 }, 00:05:29.036 "memory_domains": [ 00:05:29.036 { 00:05:29.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.036 "dma_device_type": 2 00:05:29.036 } 00:05:29.036 ], 00:05:29.036 "driver_specific": {} 00:05:29.036 } 00:05:29.036 ]' 00:05:29.036 06:47:58 -- rpc/rpc.sh@17 -- # jq length 00:05:29.036 06:47:58 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:29.036 06:47:58 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 [2024-04-27 06:47:58.688670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:29.036 [2024-04-27 06:47:58.688704] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:29.036 [2024-04-27 06:47:58.688720] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5c48090 00:05:29.036 [2024-04-27 06:47:58.688729] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:29.036 [2024-04-27 06:47:58.689534] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:29.036 [2024-04-27 06:47:58.689555] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:29.036 Passthru0 00:05:29.036 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.036 06:47:58 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.036 06:47:58 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:29.036 { 00:05:29.036 "name": "Malloc0", 00:05:29.036 "aliases": [ 00:05:29.036 "7c4858db-b3d2-4b31-b3cc-4912e61dc006" 00:05:29.036 ], 00:05:29.036 "product_name": "Malloc disk", 00:05:29.036 "block_size": 512, 00:05:29.036 "num_blocks": 16384, 00:05:29.036 "uuid": "7c4858db-b3d2-4b31-b3cc-4912e61dc006", 00:05:29.036 "assigned_rate_limits": { 00:05:29.036 "rw_ios_per_sec": 0, 00:05:29.036 
"rw_mbytes_per_sec": 0, 00:05:29.036 "r_mbytes_per_sec": 0, 00:05:29.036 "w_mbytes_per_sec": 0 00:05:29.036 }, 00:05:29.036 "claimed": true, 00:05:29.036 "claim_type": "exclusive_write", 00:05:29.036 "zoned": false, 00:05:29.036 "supported_io_types": { 00:05:29.036 "read": true, 00:05:29.036 "write": true, 00:05:29.036 "unmap": true, 00:05:29.036 "write_zeroes": true, 00:05:29.036 "flush": true, 00:05:29.036 "reset": true, 00:05:29.036 "compare": false, 00:05:29.036 "compare_and_write": false, 00:05:29.036 "abort": true, 00:05:29.036 "nvme_admin": false, 00:05:29.036 "nvme_io": false 00:05:29.036 }, 00:05:29.036 "memory_domains": [ 00:05:29.036 { 00:05:29.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.036 "dma_device_type": 2 00:05:29.036 } 00:05:29.036 ], 00:05:29.036 "driver_specific": {} 00:05:29.036 }, 00:05:29.036 { 00:05:29.036 "name": "Passthru0", 00:05:29.036 "aliases": [ 00:05:29.036 "fd905487-af06-5efb-8f28-3361533ffac8" 00:05:29.036 ], 00:05:29.036 "product_name": "passthru", 00:05:29.036 "block_size": 512, 00:05:29.036 "num_blocks": 16384, 00:05:29.036 "uuid": "fd905487-af06-5efb-8f28-3361533ffac8", 00:05:29.036 "assigned_rate_limits": { 00:05:29.036 "rw_ios_per_sec": 0, 00:05:29.036 "rw_mbytes_per_sec": 0, 00:05:29.036 "r_mbytes_per_sec": 0, 00:05:29.036 "w_mbytes_per_sec": 0 00:05:29.036 }, 00:05:29.036 "claimed": false, 00:05:29.036 "zoned": false, 00:05:29.036 "supported_io_types": { 00:05:29.036 "read": true, 00:05:29.036 "write": true, 00:05:29.036 "unmap": true, 00:05:29.036 "write_zeroes": true, 00:05:29.036 "flush": true, 00:05:29.036 "reset": true, 00:05:29.036 "compare": false, 00:05:29.036 "compare_and_write": false, 00:05:29.036 "abort": true, 00:05:29.036 "nvme_admin": false, 00:05:29.036 "nvme_io": false 00:05:29.036 }, 00:05:29.036 "memory_domains": [ 00:05:29.036 { 00:05:29.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.036 "dma_device_type": 2 00:05:29.036 } 00:05:29.036 ], 00:05:29.036 "driver_specific": { 00:05:29.036 "passthru": { 00:05:29.036 "name": "Passthru0", 00:05:29.036 "base_bdev_name": "Malloc0" 00:05:29.036 } 00:05:29.036 } 00:05:29.036 } 00:05:29.036 ]' 00:05:29.036 06:47:58 -- rpc/rpc.sh@21 -- # jq length 00:05:29.036 06:47:58 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:29.036 06:47:58 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.036 06:47:58 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.036 06:47:58 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.036 06:47:58 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:29.036 06:47:58 -- rpc/rpc.sh@26 -- # jq length 00:05:29.036 06:47:58 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:29.036 00:05:29.036 real 0m0.270s 00:05:29.036 user 0m0.166s 00:05:29.036 sys 0m0.044s 00:05:29.036 06:47:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 
************************************ 00:05:29.036 END TEST rpc_integrity 00:05:29.036 ************************************ 00:05:29.036 06:47:58 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:29.036 06:47:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.036 06:47:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 ************************************ 00:05:29.036 START TEST rpc_plugins 00:05:29.036 ************************************ 00:05:29.036 06:47:58 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:29.036 06:47:58 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.036 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.036 06:47:58 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:29.036 06:47:58 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:29.036 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.036 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.301 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.301 06:47:58 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:29.301 { 00:05:29.301 "name": "Malloc1", 00:05:29.301 "aliases": [ 00:05:29.301 "2091fa92-9fea-4fb7-90d0-4bafd979daa6" 00:05:29.301 ], 00:05:29.301 "product_name": "Malloc disk", 00:05:29.301 "block_size": 4096, 00:05:29.301 "num_blocks": 256, 00:05:29.301 "uuid": "2091fa92-9fea-4fb7-90d0-4bafd979daa6", 00:05:29.301 "assigned_rate_limits": { 00:05:29.301 "rw_ios_per_sec": 0, 00:05:29.301 "rw_mbytes_per_sec": 0, 00:05:29.301 "r_mbytes_per_sec": 0, 00:05:29.301 "w_mbytes_per_sec": 0 00:05:29.301 }, 00:05:29.301 "claimed": false, 00:05:29.301 "zoned": false, 00:05:29.301 "supported_io_types": { 00:05:29.301 "read": true, 00:05:29.301 "write": true, 00:05:29.301 "unmap": true, 00:05:29.301 "write_zeroes": true, 00:05:29.301 "flush": true, 00:05:29.301 "reset": true, 00:05:29.301 "compare": false, 00:05:29.301 "compare_and_write": false, 00:05:29.301 "abort": true, 00:05:29.301 "nvme_admin": false, 00:05:29.301 "nvme_io": false 00:05:29.301 }, 00:05:29.301 "memory_domains": [ 00:05:29.301 { 00:05:29.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.301 "dma_device_type": 2 00:05:29.301 } 00:05:29.301 ], 00:05:29.301 "driver_specific": {} 00:05:29.301 } 00:05:29.301 ]' 00:05:29.301 06:47:58 -- rpc/rpc.sh@32 -- # jq length 00:05:29.301 06:47:58 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:29.301 06:47:58 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:29.301 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.301 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.301 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.301 06:47:58 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:29.301 06:47:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.301 06:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.301 06:47:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.301 06:47:58 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:29.301 06:47:58 -- rpc/rpc.sh@36 -- # jq length 00:05:29.301 06:47:59 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:29.301 00:05:29.301 real 0m0.141s 00:05:29.301 user 0m0.085s 00:05:29.301 sys 0m0.018s 00:05:29.301 06:47:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:05:29.301 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.301 ************************************ 00:05:29.301 END TEST rpc_plugins 00:05:29.301 ************************************ 00:05:29.301 06:47:59 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:29.301 06:47:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.301 06:47:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.301 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.301 ************************************ 00:05:29.301 START TEST rpc_trace_cmd_test 00:05:29.301 ************************************ 00:05:29.301 06:47:59 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:29.301 06:47:59 -- rpc/rpc.sh@40 -- # local info 00:05:29.301 06:47:59 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:29.301 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.301 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.301 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.301 06:47:59 -- rpc/rpc.sh@42 -- # info='{ 00:05:29.301 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2596201", 00:05:29.301 "tpoint_group_mask": "0x8", 00:05:29.301 "iscsi_conn": { 00:05:29.301 "mask": "0x2", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "scsi": { 00:05:29.301 "mask": "0x4", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "bdev": { 00:05:29.301 "mask": "0x8", 00:05:29.301 "tpoint_mask": "0xffffffffffffffff" 00:05:29.301 }, 00:05:29.301 "nvmf_rdma": { 00:05:29.301 "mask": "0x10", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "nvmf_tcp": { 00:05:29.301 "mask": "0x20", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "ftl": { 00:05:29.301 "mask": "0x40", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "blobfs": { 00:05:29.301 "mask": "0x80", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "dsa": { 00:05:29.301 "mask": "0x200", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "thread": { 00:05:29.301 "mask": "0x400", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "nvme_pcie": { 00:05:29.301 "mask": "0x800", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "iaa": { 00:05:29.301 "mask": "0x1000", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "nvme_tcp": { 00:05:29.301 "mask": "0x2000", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 }, 00:05:29.301 "bdev_nvme": { 00:05:29.301 "mask": "0x4000", 00:05:29.301 "tpoint_mask": "0x0" 00:05:29.301 } 00:05:29.301 }' 00:05:29.301 06:47:59 -- rpc/rpc.sh@43 -- # jq length 00:05:29.301 06:47:59 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:29.301 06:47:59 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:29.301 06:47:59 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:29.301 06:47:59 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:29.560 06:47:59 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:29.560 06:47:59 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:29.560 06:47:59 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:29.560 06:47:59 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:29.560 06:47:59 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:29.561 00:05:29.561 real 0m0.215s 00:05:29.561 user 0m0.165s 00:05:29.561 sys 0m0.040s 00:05:29.561 06:47:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.561 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.561 
************************************ 00:05:29.561 END TEST rpc_trace_cmd_test 00:05:29.561 ************************************ 00:05:29.561 06:47:59 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:29.561 06:47:59 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:29.561 06:47:59 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:29.561 06:47:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.561 06:47:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.561 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.561 ************************************ 00:05:29.561 START TEST rpc_daemon_integrity 00:05:29.561 ************************************ 00:05:29.561 06:47:59 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:29.561 06:47:59 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:29.561 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.561 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.561 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.561 06:47:59 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:29.561 06:47:59 -- rpc/rpc.sh@13 -- # jq length 00:05:29.561 06:47:59 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:29.561 06:47:59 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:29.561 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.561 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.561 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.561 06:47:59 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:29.561 06:47:59 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:29.561 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.561 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.561 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.561 06:47:59 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:29.561 { 00:05:29.561 "name": "Malloc2", 00:05:29.561 "aliases": [ 00:05:29.561 "02eeb7e0-7672-40d7-be31-7b8071b8b4a5" 00:05:29.561 ], 00:05:29.561 "product_name": "Malloc disk", 00:05:29.561 "block_size": 512, 00:05:29.561 "num_blocks": 16384, 00:05:29.561 "uuid": "02eeb7e0-7672-40d7-be31-7b8071b8b4a5", 00:05:29.561 "assigned_rate_limits": { 00:05:29.561 "rw_ios_per_sec": 0, 00:05:29.561 "rw_mbytes_per_sec": 0, 00:05:29.561 "r_mbytes_per_sec": 0, 00:05:29.561 "w_mbytes_per_sec": 0 00:05:29.561 }, 00:05:29.561 "claimed": false, 00:05:29.561 "zoned": false, 00:05:29.561 "supported_io_types": { 00:05:29.561 "read": true, 00:05:29.561 "write": true, 00:05:29.561 "unmap": true, 00:05:29.561 "write_zeroes": true, 00:05:29.561 "flush": true, 00:05:29.561 "reset": true, 00:05:29.561 "compare": false, 00:05:29.561 "compare_and_write": false, 00:05:29.561 "abort": true, 00:05:29.561 "nvme_admin": false, 00:05:29.561 "nvme_io": false 00:05:29.561 }, 00:05:29.561 "memory_domains": [ 00:05:29.561 { 00:05:29.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.561 "dma_device_type": 2 00:05:29.561 } 00:05:29.561 ], 00:05:29.561 "driver_specific": {} 00:05:29.561 } 00:05:29.561 ]' 00:05:29.561 06:47:59 -- rpc/rpc.sh@17 -- # jq length 00:05:29.561 06:47:59 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:29.561 06:47:59 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:29.561 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.561 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.820 [2024-04-27 06:47:59.458650] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:29.820 [2024-04-27 06:47:59.458680] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:29.820 [2024-04-27 06:47:59.458700] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5dd1980 00:05:29.820 [2024-04-27 06:47:59.458709] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:29.820 [2024-04-27 06:47:59.459401] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:29.820 [2024-04-27 06:47:59.459421] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:29.820 Passthru0 00:05:29.820 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.820 06:47:59 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:29.820 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.820 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.820 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.820 06:47:59 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:29.820 { 00:05:29.820 "name": "Malloc2", 00:05:29.821 "aliases": [ 00:05:29.821 "02eeb7e0-7672-40d7-be31-7b8071b8b4a5" 00:05:29.821 ], 00:05:29.821 "product_name": "Malloc disk", 00:05:29.821 "block_size": 512, 00:05:29.821 "num_blocks": 16384, 00:05:29.821 "uuid": "02eeb7e0-7672-40d7-be31-7b8071b8b4a5", 00:05:29.821 "assigned_rate_limits": { 00:05:29.821 "rw_ios_per_sec": 0, 00:05:29.821 "rw_mbytes_per_sec": 0, 00:05:29.821 "r_mbytes_per_sec": 0, 00:05:29.821 "w_mbytes_per_sec": 0 00:05:29.821 }, 00:05:29.821 "claimed": true, 00:05:29.821 "claim_type": "exclusive_write", 00:05:29.821 "zoned": false, 00:05:29.821 "supported_io_types": { 00:05:29.821 "read": true, 00:05:29.821 "write": true, 00:05:29.821 "unmap": true, 00:05:29.821 "write_zeroes": true, 00:05:29.821 "flush": true, 00:05:29.821 "reset": true, 00:05:29.821 "compare": false, 00:05:29.821 "compare_and_write": false, 00:05:29.821 "abort": true, 00:05:29.821 "nvme_admin": false, 00:05:29.821 "nvme_io": false 00:05:29.821 }, 00:05:29.821 "memory_domains": [ 00:05:29.821 { 00:05:29.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.821 "dma_device_type": 2 00:05:29.821 } 00:05:29.821 ], 00:05:29.821 "driver_specific": {} 00:05:29.821 }, 00:05:29.821 { 00:05:29.821 "name": "Passthru0", 00:05:29.821 "aliases": [ 00:05:29.821 "56425cb3-6208-5f41-9714-aac19481a8db" 00:05:29.821 ], 00:05:29.821 "product_name": "passthru", 00:05:29.821 "block_size": 512, 00:05:29.821 "num_blocks": 16384, 00:05:29.821 "uuid": "56425cb3-6208-5f41-9714-aac19481a8db", 00:05:29.821 "assigned_rate_limits": { 00:05:29.821 "rw_ios_per_sec": 0, 00:05:29.821 "rw_mbytes_per_sec": 0, 00:05:29.821 "r_mbytes_per_sec": 0, 00:05:29.821 "w_mbytes_per_sec": 0 00:05:29.821 }, 00:05:29.821 "claimed": false, 00:05:29.821 "zoned": false, 00:05:29.821 "supported_io_types": { 00:05:29.821 "read": true, 00:05:29.821 "write": true, 00:05:29.821 "unmap": true, 00:05:29.821 "write_zeroes": true, 00:05:29.821 "flush": true, 00:05:29.821 "reset": true, 00:05:29.821 "compare": false, 00:05:29.821 "compare_and_write": false, 00:05:29.821 "abort": true, 00:05:29.821 "nvme_admin": false, 00:05:29.821 "nvme_io": false 00:05:29.821 }, 00:05:29.821 "memory_domains": [ 00:05:29.821 { 00:05:29.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.821 "dma_device_type": 2 00:05:29.821 } 00:05:29.821 ], 00:05:29.821 "driver_specific": { 00:05:29.821 "passthru": { 00:05:29.821 "name": "Passthru0", 00:05:29.821 "base_bdev_name": "Malloc2" 00:05:29.821 } 
00:05:29.821 } 00:05:29.821 } 00:05:29.821 ]' 00:05:29.821 06:47:59 -- rpc/rpc.sh@21 -- # jq length 00:05:29.821 06:47:59 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:29.821 06:47:59 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:29.821 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.821 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.821 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.821 06:47:59 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:29.821 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.821 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.821 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.821 06:47:59 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:29.821 06:47:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.821 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.821 06:47:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.821 06:47:59 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:29.821 06:47:59 -- rpc/rpc.sh@26 -- # jq length 00:05:29.821 06:47:59 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:29.821 00:05:29.821 real 0m0.246s 00:05:29.821 user 0m0.147s 00:05:29.821 sys 0m0.037s 00:05:29.821 06:47:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.821 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.821 ************************************ 00:05:29.821 END TEST rpc_daemon_integrity 00:05:29.821 ************************************ 00:05:29.821 06:47:59 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:29.821 06:47:59 -- rpc/rpc.sh@84 -- # killprocess 2596201 00:05:29.821 06:47:59 -- common/autotest_common.sh@926 -- # '[' -z 2596201 ']' 00:05:29.821 06:47:59 -- common/autotest_common.sh@930 -- # kill -0 2596201 00:05:29.821 06:47:59 -- common/autotest_common.sh@931 -- # uname 00:05:29.821 06:47:59 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:29.821 06:47:59 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2596201 00:05:29.821 06:47:59 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:29.821 06:47:59 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:29.821 06:47:59 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2596201' 00:05:29.821 killing process with pid 2596201 00:05:29.821 06:47:59 -- common/autotest_common.sh@945 -- # kill 2596201 00:05:29.821 06:47:59 -- common/autotest_common.sh@950 -- # wait 2596201 00:05:30.080 00:05:30.080 real 0m2.356s 00:05:30.080 user 0m2.970s 00:05:30.080 sys 0m0.693s 00:05:30.080 06:47:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.080 06:47:59 -- common/autotest_common.sh@10 -- # set +x 00:05:30.080 ************************************ 00:05:30.080 END TEST rpc 00:05:30.080 ************************************ 00:05:30.339 06:48:00 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:30.339 06:48:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:30.339 06:48:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.339 06:48:00 -- common/autotest_common.sh@10 -- # set +x 00:05:30.339 ************************************ 00:05:30.339 START TEST rpc_client 00:05:30.339 ************************************ 00:05:30.339 06:48:00 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:30.339 * Looking for test storage... 00:05:30.339 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:30.339 06:48:00 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:30.339 OK 00:05:30.339 06:48:00 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:30.339 00:05:30.339 real 0m0.113s 00:05:30.339 user 0m0.042s 00:05:30.339 sys 0m0.078s 00:05:30.339 06:48:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.339 06:48:00 -- common/autotest_common.sh@10 -- # set +x 00:05:30.339 ************************************ 00:05:30.339 END TEST rpc_client 00:05:30.339 ************************************ 00:05:30.339 06:48:00 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:30.339 06:48:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:30.339 06:48:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.339 06:48:00 -- common/autotest_common.sh@10 -- # set +x 00:05:30.339 ************************************ 00:05:30.339 START TEST json_config 00:05:30.339 ************************************ 00:05:30.339 06:48:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:30.599 06:48:00 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:30.599 06:48:00 -- nvmf/common.sh@7 -- # uname -s 00:05:30.599 06:48:00 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:30.599 06:48:00 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:30.599 06:48:00 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:30.599 06:48:00 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:30.599 06:48:00 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:30.599 06:48:00 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:30.599 06:48:00 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:30.599 06:48:00 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:30.599 06:48:00 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:30.599 06:48:00 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:30.599 06:48:00 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:30.599 06:48:00 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:30.599 06:48:00 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:30.599 06:48:00 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:30.599 06:48:00 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:30.599 06:48:00 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:30.599 06:48:00 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:30.599 06:48:00 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:30.599 06:48:00 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:30.599 06:48:00 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- paths/export.sh@5 -- # export PATH 00:05:30.599 06:48:00 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- nvmf/common.sh@46 -- # : 0 00:05:30.599 06:48:00 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:30.599 06:48:00 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:30.599 06:48:00 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:30.599 06:48:00 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:30.599 06:48:00 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:30.599 06:48:00 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:30.599 06:48:00 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:30.599 06:48:00 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:30.599 06:48:00 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:30.599 06:48:00 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:30.599 06:48:00 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:30.599 06:48:00 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:30.599 06:48:00 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:30.599 WARNING: No tests are enabled so not running JSON configuration tests 00:05:30.599 06:48:00 -- json_config/json_config.sh@27 -- # exit 0 00:05:30.599 00:05:30.599 real 0m0.101s 00:05:30.599 user 0m0.056s 00:05:30.599 sys 0m0.046s 00:05:30.599 06:48:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.599 06:48:00 -- common/autotest_common.sh@10 -- # set +x 00:05:30.599 ************************************ 00:05:30.599 END TEST json_config 00:05:30.599 ************************************ 00:05:30.599 06:48:00 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:30.599 06:48:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:30.599 06:48:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.599 06:48:00 -- common/autotest_common.sh@10 -- # set +x 00:05:30.599 ************************************ 00:05:30.599 START TEST json_config_extra_key 00:05:30.599 ************************************ 00:05:30.599 06:48:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:30.599 06:48:00 -- nvmf/common.sh@7 -- # uname -s 00:05:30.599 06:48:00 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:30.599 06:48:00 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:30.599 06:48:00 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:30.599 06:48:00 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:30.599 06:48:00 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:30.599 06:48:00 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:30.599 06:48:00 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:30.599 06:48:00 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:30.599 06:48:00 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:30.599 06:48:00 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:30.599 06:48:00 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:30.599 06:48:00 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:30.599 06:48:00 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:30.599 06:48:00 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:30.599 06:48:00 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:30.599 06:48:00 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:30.599 06:48:00 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:30.599 06:48:00 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:30.599 06:48:00 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:30.599 06:48:00 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- paths/export.sh@5 -- # export PATH 00:05:30.599 06:48:00 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:30.599 06:48:00 -- nvmf/common.sh@46 -- # : 0 00:05:30.599 06:48:00 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:30.599 06:48:00 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:30.599 06:48:00 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:30.599 06:48:00 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:30.599 06:48:00 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:30.599 06:48:00 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:30.599 06:48:00 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:30.599 06:48:00 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:30.599 INFO: launching applications... 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:30.599 06:48:00 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:30.600 06:48:00 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:30.600 06:48:00 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=2596922 00:05:30.600 06:48:00 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:30.600 Waiting for target to run... 
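The trace above tracks the launched app entirely in Bash associative arrays (app_pid, app_socket, app_params, configs_path), all keyed by the logical app name 'target'. A minimal sketch of that bookkeeping pattern, with a placeholder binary name standing in for the real spdk_tgt invocation:

    #!/usr/bin/env bash
    # One associative array per attribute, keyed by app name, as in the trace.
    declare -A app_pid=([target]='')
    declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
    declare -A app_params=([target]='-m 0x1 -s 1024')

    app=target
    # 'target_binary' is a placeholder, not the real spdk_tgt command line.
    target_binary ${app_params[$app]} -r "${app_socket[$app]}" &
    app_pid[$app]=$!    # remember the PID under the app's logical name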
00:05:30.600 06:48:00 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 2596922 /var/tmp/spdk_tgt.sock 00:05:30.600 06:48:00 -- common/autotest_common.sh@819 -- # '[' -z 2596922 ']' 00:05:30.600 06:48:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:30.600 06:48:00 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:30.600 06:48:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:30.600 06:48:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:30.600 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:30.600 06:48:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:30.600 06:48:00 -- common/autotest_common.sh@10 -- # set +x 00:05:30.600 [2024-04-27 06:48:00.455125] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:30.600 [2024-04-27 06:48:00.455206] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2596922 ] 00:05:30.600 EAL: No free 2048 kB hugepages reported on node 1 00:05:31.167 [2024-04-27 06:48:00.884160] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:31.167 [2024-04-27 06:48:00.911489] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:31.167 [2024-04-27 06:48:00.911599] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:31.426 06:48:01 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:31.426 06:48:01 -- common/autotest_common.sh@852 -- # return 0 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:31.426 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:31.426 INFO: shutting down applications... 
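waitforlisten above blocks until pid 2596922 is serving RPCs on /var/tmp/spdk_tgt.sock, retrying up to max_retries=100. A minimal sketch of that wait-for-listen idea, not SPDK's actual helper (checking that the socket node exists is weaker than checking that it accepts connections):

    waitforlisten_sketch() {
        local pid=$1 sock=$2 retries=100
        while (( retries-- > 0 )); do
            kill -0 "$pid" 2>/dev/null || return 1   # target died before listening
            [ -S "$sock" ] && return 0               # socket node is present
            sleep 0.1
        done
        return 1                                     # gave up after 100 tries
    }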
00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 2596922 ]] 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 2596922 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2596922 00:05:31.426 06:48:01 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2596922 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:31.994 SPDK target shutdown done 00:05:31.994 06:48:01 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:31.994 Success 00:05:31.994 00:05:31.994 real 0m1.452s 00:05:31.994 user 0m1.042s 00:05:31.994 sys 0m0.538s 00:05:31.994 06:48:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.994 06:48:01 -- common/autotest_common.sh@10 -- # set +x 00:05:31.994 ************************************ 00:05:31.994 END TEST json_config_extra_key 00:05:31.994 ************************************ 00:05:31.994 06:48:01 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:31.994 06:48:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:31.994 06:48:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:31.994 06:48:01 -- common/autotest_common.sh@10 -- # set +x 00:05:31.994 ************************************ 00:05:31.994 START TEST alias_rpc 00:05:31.994 ************************************ 00:05:31.994 06:48:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:32.254 * Looking for test storage... 00:05:32.254 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:32.254 06:48:01 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:32.254 06:48:01 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.254 06:48:01 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2597308 00:05:32.254 06:48:01 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2597308 00:05:32.254 06:48:01 -- common/autotest_common.sh@819 -- # '[' -z 2597308 ']' 00:05:32.254 06:48:01 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:32.254 06:48:01 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:32.254 06:48:01 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
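The shutdown sequence above sends SIGINT and then polls with kill -0 (which only probes for existence, delivering no signal), sleeping 0.5s between checks for at most 30 iterations. A minimal sketch of the same graceful-shutdown-with-timeout loop:

    shutdown_app_sketch() {
        local pid=$1 i
        kill -SIGINT "$pid"                      # ask the target to exit cleanly
        for (( i = 0; i < 30; i++ )); do
            kill -0 "$pid" 2>/dev/null || {      # process gone -> clean shutdown
                echo 'shutdown done'; return 0; }
            sleep 0.5
        done
        return 1                                 # still alive after ~15s
    }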
00:05:32.254 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:32.254 06:48:01 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:32.254 06:48:01 -- common/autotest_common.sh@10 -- # set +x 00:05:32.254 [2024-04-27 06:48:01.952679] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:32.254 [2024-04-27 06:48:01.952741] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2597308 ] 00:05:32.254 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.254 [2024-04-27 06:48:02.019759] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.254 [2024-04-27 06:48:02.059917] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:32.254 [2024-04-27 06:48:02.060031] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.189 06:48:02 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:33.189 06:48:02 -- common/autotest_common.sh@852 -- # return 0 00:05:33.189 06:48:02 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:33.189 06:48:02 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2597308 00:05:33.189 06:48:02 -- common/autotest_common.sh@926 -- # '[' -z 2597308 ']' 00:05:33.189 06:48:02 -- common/autotest_common.sh@930 -- # kill -0 2597308 00:05:33.189 06:48:02 -- common/autotest_common.sh@931 -- # uname 00:05:33.189 06:48:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:33.189 06:48:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2597308 00:05:33.189 06:48:03 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:33.189 06:48:03 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:33.189 06:48:03 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2597308' 00:05:33.190 killing process with pid 2597308 00:05:33.190 06:48:03 -- common/autotest_common.sh@945 -- # kill 2597308 00:05:33.190 06:48:03 -- common/autotest_common.sh@950 -- # wait 2597308 00:05:33.448 00:05:33.448 real 0m1.485s 00:05:33.448 user 0m1.596s 00:05:33.448 sys 0m0.421s 00:05:33.448 06:48:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:33.448 06:48:03 -- common/autotest_common.sh@10 -- # set +x 00:05:33.448 ************************************ 00:05:33.448 END TEST alias_rpc 00:05:33.448 ************************************ 00:05:33.707 06:48:03 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:33.707 06:48:03 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:33.707 06:48:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:33.707 06:48:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:33.707 06:48:03 -- common/autotest_common.sh@10 -- # set +x 00:05:33.707 ************************************ 00:05:33.707 START TEST spdkcli_tcp 00:05:33.707 ************************************ 00:05:33.707 06:48:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:33.707 * Looking for test storage... 
00:05:33.707 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:33.707 06:48:03 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:33.707 06:48:03 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:33.707 06:48:03 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:33.707 06:48:03 -- common/autotest_common.sh@10 -- # set +x 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2597620 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@27 -- # waitforlisten 2597620 00:05:33.707 06:48:03 -- common/autotest_common.sh@819 -- # '[' -z 2597620 ']' 00:05:33.707 06:48:03 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:33.707 06:48:03 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:33.707 06:48:03 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:33.707 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:33.707 06:48:03 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:33.707 06:48:03 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:33.707 06:48:03 -- common/autotest_common.sh@10 -- # set +x 00:05:33.707 [2024-04-27 06:48:03.468142] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:33.707 [2024-04-27 06:48:03.468229] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2597620 ] 00:05:33.707 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.707 [2024-04-27 06:48:03.538896] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:33.707 [2024-04-27 06:48:03.576915] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:33.707 [2024-04-27 06:48:03.579526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:33.707 [2024-04-27 06:48:03.579530] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.645 06:48:04 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:34.645 06:48:04 -- common/autotest_common.sh@852 -- # return 0 00:05:34.645 06:48:04 -- spdkcli/tcp.sh@31 -- # socat_pid=2597911 00:05:34.645 06:48:04 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:34.645 06:48:04 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:34.645 [ 00:05:34.645 "spdk_get_version", 00:05:34.645 "rpc_get_methods", 00:05:34.645 "trace_get_info", 00:05:34.645 "trace_get_tpoint_group_mask", 00:05:34.645 "trace_disable_tpoint_group", 00:05:34.645 "trace_enable_tpoint_group", 00:05:34.645 "trace_clear_tpoint_mask", 00:05:34.645 "trace_set_tpoint_mask", 00:05:34.645 "vfu_tgt_set_base_path", 00:05:34.645 "framework_get_pci_devices", 00:05:34.645 "framework_get_config", 00:05:34.645 "framework_get_subsystems", 00:05:34.645 "iobuf_get_stats", 00:05:34.645 "iobuf_set_options", 00:05:34.645 "sock_set_default_impl", 00:05:34.645 "sock_impl_set_options", 00:05:34.645 "sock_impl_get_options", 00:05:34.645 "vmd_rescan", 00:05:34.645 "vmd_remove_device", 00:05:34.645 "vmd_enable", 00:05:34.645 "accel_get_stats", 00:05:34.645 "accel_set_options", 00:05:34.645 "accel_set_driver", 00:05:34.645 "accel_crypto_key_destroy", 00:05:34.645 "accel_crypto_keys_get", 00:05:34.645 "accel_crypto_key_create", 00:05:34.645 "accel_assign_opc", 00:05:34.645 "accel_get_module_info", 00:05:34.645 "accel_get_opc_assignments", 00:05:34.645 "notify_get_notifications", 00:05:34.645 "notify_get_types", 00:05:34.645 "bdev_get_histogram", 00:05:34.645 "bdev_enable_histogram", 00:05:34.645 "bdev_set_qos_limit", 00:05:34.645 "bdev_set_qd_sampling_period", 00:05:34.645 "bdev_get_bdevs", 00:05:34.645 "bdev_reset_iostat", 00:05:34.645 "bdev_get_iostat", 00:05:34.645 "bdev_examine", 00:05:34.645 "bdev_wait_for_examine", 00:05:34.645 "bdev_set_options", 00:05:34.645 "scsi_get_devices", 00:05:34.645 "thread_set_cpumask", 00:05:34.645 "framework_get_scheduler", 00:05:34.645 "framework_set_scheduler", 00:05:34.645 "framework_get_reactors", 00:05:34.645 "thread_get_io_channels", 00:05:34.645 "thread_get_pollers", 00:05:34.645 "thread_get_stats", 00:05:34.645 "framework_monitor_context_switch", 00:05:34.645 "spdk_kill_instance", 00:05:34.645 "log_enable_timestamps", 00:05:34.645 "log_get_flags", 00:05:34.645 "log_clear_flag", 00:05:34.645 "log_set_flag", 00:05:34.645 "log_get_level", 00:05:34.645 "log_set_level", 00:05:34.645 "log_get_print_level", 00:05:34.645 "log_set_print_level", 00:05:34.645 "framework_enable_cpumask_locks", 00:05:34.645 "framework_disable_cpumask_locks", 00:05:34.645 "framework_wait_init", 00:05:34.645 
"framework_start_init", 00:05:34.645 "virtio_blk_create_transport", 00:05:34.645 "virtio_blk_get_transports", 00:05:34.645 "vhost_controller_set_coalescing", 00:05:34.645 "vhost_get_controllers", 00:05:34.645 "vhost_delete_controller", 00:05:34.645 "vhost_create_blk_controller", 00:05:34.645 "vhost_scsi_controller_remove_target", 00:05:34.645 "vhost_scsi_controller_add_target", 00:05:34.645 "vhost_start_scsi_controller", 00:05:34.645 "vhost_create_scsi_controller", 00:05:34.645 "ublk_recover_disk", 00:05:34.645 "ublk_get_disks", 00:05:34.645 "ublk_stop_disk", 00:05:34.645 "ublk_start_disk", 00:05:34.645 "ublk_destroy_target", 00:05:34.645 "ublk_create_target", 00:05:34.645 "nbd_get_disks", 00:05:34.645 "nbd_stop_disk", 00:05:34.645 "nbd_start_disk", 00:05:34.645 "env_dpdk_get_mem_stats", 00:05:34.645 "nvmf_subsystem_get_listeners", 00:05:34.645 "nvmf_subsystem_get_qpairs", 00:05:34.646 "nvmf_subsystem_get_controllers", 00:05:34.646 "nvmf_get_stats", 00:05:34.646 "nvmf_get_transports", 00:05:34.646 "nvmf_create_transport", 00:05:34.646 "nvmf_get_targets", 00:05:34.646 "nvmf_delete_target", 00:05:34.646 "nvmf_create_target", 00:05:34.646 "nvmf_subsystem_allow_any_host", 00:05:34.646 "nvmf_subsystem_remove_host", 00:05:34.646 "nvmf_subsystem_add_host", 00:05:34.646 "nvmf_subsystem_remove_ns", 00:05:34.646 "nvmf_subsystem_add_ns", 00:05:34.646 "nvmf_subsystem_listener_set_ana_state", 00:05:34.646 "nvmf_discovery_get_referrals", 00:05:34.646 "nvmf_discovery_remove_referral", 00:05:34.646 "nvmf_discovery_add_referral", 00:05:34.646 "nvmf_subsystem_remove_listener", 00:05:34.646 "nvmf_subsystem_add_listener", 00:05:34.646 "nvmf_delete_subsystem", 00:05:34.646 "nvmf_create_subsystem", 00:05:34.646 "nvmf_get_subsystems", 00:05:34.646 "nvmf_set_crdt", 00:05:34.646 "nvmf_set_config", 00:05:34.646 "nvmf_set_max_subsystems", 00:05:34.646 "iscsi_set_options", 00:05:34.646 "iscsi_get_auth_groups", 00:05:34.646 "iscsi_auth_group_remove_secret", 00:05:34.646 "iscsi_auth_group_add_secret", 00:05:34.646 "iscsi_delete_auth_group", 00:05:34.646 "iscsi_create_auth_group", 00:05:34.646 "iscsi_set_discovery_auth", 00:05:34.646 "iscsi_get_options", 00:05:34.646 "iscsi_target_node_request_logout", 00:05:34.646 "iscsi_target_node_set_redirect", 00:05:34.646 "iscsi_target_node_set_auth", 00:05:34.646 "iscsi_target_node_add_lun", 00:05:34.646 "iscsi_get_connections", 00:05:34.646 "iscsi_portal_group_set_auth", 00:05:34.646 "iscsi_start_portal_group", 00:05:34.646 "iscsi_delete_portal_group", 00:05:34.646 "iscsi_create_portal_group", 00:05:34.646 "iscsi_get_portal_groups", 00:05:34.646 "iscsi_delete_target_node", 00:05:34.646 "iscsi_target_node_remove_pg_ig_maps", 00:05:34.646 "iscsi_target_node_add_pg_ig_maps", 00:05:34.646 "iscsi_create_target_node", 00:05:34.646 "iscsi_get_target_nodes", 00:05:34.646 "iscsi_delete_initiator_group", 00:05:34.646 "iscsi_initiator_group_remove_initiators", 00:05:34.646 "iscsi_initiator_group_add_initiators", 00:05:34.646 "iscsi_create_initiator_group", 00:05:34.646 "iscsi_get_initiator_groups", 00:05:34.646 "vfu_virtio_create_scsi_endpoint", 00:05:34.646 "vfu_virtio_scsi_remove_target", 00:05:34.646 "vfu_virtio_scsi_add_target", 00:05:34.646 "vfu_virtio_create_blk_endpoint", 00:05:34.646 "vfu_virtio_delete_endpoint", 00:05:34.646 "iaa_scan_accel_module", 00:05:34.646 "dsa_scan_accel_module", 00:05:34.646 "ioat_scan_accel_module", 00:05:34.646 "accel_error_inject_error", 00:05:34.646 "bdev_iscsi_delete", 00:05:34.646 "bdev_iscsi_create", 00:05:34.646 "bdev_iscsi_set_options", 
00:05:34.646 "bdev_virtio_attach_controller", 00:05:34.646 "bdev_virtio_scsi_get_devices", 00:05:34.646 "bdev_virtio_detach_controller", 00:05:34.646 "bdev_virtio_blk_set_hotplug", 00:05:34.646 "bdev_ftl_set_property", 00:05:34.646 "bdev_ftl_get_properties", 00:05:34.646 "bdev_ftl_get_stats", 00:05:34.646 "bdev_ftl_unmap", 00:05:34.646 "bdev_ftl_unload", 00:05:34.646 "bdev_ftl_delete", 00:05:34.646 "bdev_ftl_load", 00:05:34.646 "bdev_ftl_create", 00:05:34.646 "bdev_aio_delete", 00:05:34.646 "bdev_aio_rescan", 00:05:34.646 "bdev_aio_create", 00:05:34.646 "blobfs_create", 00:05:34.646 "blobfs_detect", 00:05:34.646 "blobfs_set_cache_size", 00:05:34.646 "bdev_zone_block_delete", 00:05:34.646 "bdev_zone_block_create", 00:05:34.646 "bdev_delay_delete", 00:05:34.646 "bdev_delay_create", 00:05:34.646 "bdev_delay_update_latency", 00:05:34.646 "bdev_split_delete", 00:05:34.646 "bdev_split_create", 00:05:34.646 "bdev_error_inject_error", 00:05:34.646 "bdev_error_delete", 00:05:34.646 "bdev_error_create", 00:05:34.646 "bdev_raid_set_options", 00:05:34.646 "bdev_raid_remove_base_bdev", 00:05:34.646 "bdev_raid_add_base_bdev", 00:05:34.646 "bdev_raid_delete", 00:05:34.646 "bdev_raid_create", 00:05:34.646 "bdev_raid_get_bdevs", 00:05:34.646 "bdev_lvol_grow_lvstore", 00:05:34.646 "bdev_lvol_get_lvols", 00:05:34.646 "bdev_lvol_get_lvstores", 00:05:34.646 "bdev_lvol_delete", 00:05:34.646 "bdev_lvol_set_read_only", 00:05:34.646 "bdev_lvol_resize", 00:05:34.646 "bdev_lvol_decouple_parent", 00:05:34.646 "bdev_lvol_inflate", 00:05:34.646 "bdev_lvol_rename", 00:05:34.646 "bdev_lvol_clone_bdev", 00:05:34.646 "bdev_lvol_clone", 00:05:34.646 "bdev_lvol_snapshot", 00:05:34.646 "bdev_lvol_create", 00:05:34.646 "bdev_lvol_delete_lvstore", 00:05:34.646 "bdev_lvol_rename_lvstore", 00:05:34.646 "bdev_lvol_create_lvstore", 00:05:34.646 "bdev_passthru_delete", 00:05:34.646 "bdev_passthru_create", 00:05:34.646 "bdev_nvme_cuse_unregister", 00:05:34.646 "bdev_nvme_cuse_register", 00:05:34.646 "bdev_opal_new_user", 00:05:34.646 "bdev_opal_set_lock_state", 00:05:34.646 "bdev_opal_delete", 00:05:34.646 "bdev_opal_get_info", 00:05:34.646 "bdev_opal_create", 00:05:34.646 "bdev_nvme_opal_revert", 00:05:34.646 "bdev_nvme_opal_init", 00:05:34.646 "bdev_nvme_send_cmd", 00:05:34.646 "bdev_nvme_get_path_iostat", 00:05:34.646 "bdev_nvme_get_mdns_discovery_info", 00:05:34.646 "bdev_nvme_stop_mdns_discovery", 00:05:34.646 "bdev_nvme_start_mdns_discovery", 00:05:34.646 "bdev_nvme_set_multipath_policy", 00:05:34.646 "bdev_nvme_set_preferred_path", 00:05:34.646 "bdev_nvme_get_io_paths", 00:05:34.646 "bdev_nvme_remove_error_injection", 00:05:34.646 "bdev_nvme_add_error_injection", 00:05:34.646 "bdev_nvme_get_discovery_info", 00:05:34.646 "bdev_nvme_stop_discovery", 00:05:34.646 "bdev_nvme_start_discovery", 00:05:34.646 "bdev_nvme_get_controller_health_info", 00:05:34.646 "bdev_nvme_disable_controller", 00:05:34.646 "bdev_nvme_enable_controller", 00:05:34.646 "bdev_nvme_reset_controller", 00:05:34.646 "bdev_nvme_get_transport_statistics", 00:05:34.646 "bdev_nvme_apply_firmware", 00:05:34.646 "bdev_nvme_detach_controller", 00:05:34.646 "bdev_nvme_get_controllers", 00:05:34.646 "bdev_nvme_attach_controller", 00:05:34.646 "bdev_nvme_set_hotplug", 00:05:34.646 "bdev_nvme_set_options", 00:05:34.646 "bdev_null_resize", 00:05:34.646 "bdev_null_delete", 00:05:34.646 "bdev_null_create", 00:05:34.646 "bdev_malloc_delete", 00:05:34.646 "bdev_malloc_create" 00:05:34.646 ] 00:05:34.646 06:48:04 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:34.646 06:48:04 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:34.646 06:48:04 -- common/autotest_common.sh@10 -- # set +x 00:05:34.646 06:48:04 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:34.646 06:48:04 -- spdkcli/tcp.sh@38 -- # killprocess 2597620 00:05:34.646 06:48:04 -- common/autotest_common.sh@926 -- # '[' -z 2597620 ']' 00:05:34.646 06:48:04 -- common/autotest_common.sh@930 -- # kill -0 2597620 00:05:34.646 06:48:04 -- common/autotest_common.sh@931 -- # uname 00:05:34.646 06:48:04 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:34.646 06:48:04 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2597620 00:05:34.905 06:48:04 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:34.905 06:48:04 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:34.905 06:48:04 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2597620' 00:05:34.905 killing process with pid 2597620 00:05:34.905 06:48:04 -- common/autotest_common.sh@945 -- # kill 2597620 00:05:34.905 06:48:04 -- common/autotest_common.sh@950 -- # wait 2597620 00:05:35.165 00:05:35.165 real 0m1.497s 00:05:35.165 user 0m2.849s 00:05:35.165 sys 0m0.476s 00:05:35.165 06:48:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.165 06:48:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.165 ************************************ 00:05:35.165 END TEST spdkcli_tcp 00:05:35.165 ************************************ 00:05:35.165 06:48:04 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:35.165 06:48:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.165 06:48:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.165 06:48:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.165 ************************************ 00:05:35.165 START TEST dpdk_mem_utility 00:05:35.165 ************************************ 00:05:35.165 06:48:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:35.165 * Looking for test storage... 00:05:35.165 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:35.165 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:35.165 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2598271 00:05:35.165 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:35.165 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2598271 00:05:35.165 06:48:05 -- common/autotest_common.sh@819 -- # '[' -z 2598271 ']' 00:05:35.165 06:48:05 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.165 06:48:05 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:35.165 06:48:05 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:35.165 06:48:05 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:35.165 06:48:05 -- common/autotest_common.sh@10 -- # set +x 00:05:35.165 [2024-04-27 06:48:05.031199] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:35.165 [2024-04-27 06:48:05.031267] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598271 ] 00:05:35.424 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.424 [2024-04-27 06:48:05.100777] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.424 [2024-04-27 06:48:05.138512] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:35.424 [2024-04-27 06:48:05.138634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.989 06:48:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:35.989 06:48:05 -- common/autotest_common.sh@852 -- # return 0 00:05:35.989 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:35.989 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:35.989 06:48:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:35.989 06:48:05 -- common/autotest_common.sh@10 -- # set +x 00:05:35.989 { 00:05:35.989 "filename": "/tmp/spdk_mem_dump.txt" 00:05:35.989 } 00:05:35.989 06:48:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:35.989 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:36.249 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:36.249 1 heaps totaling size 814.000000 MiB 00:05:36.249 size: 814.000000 MiB heap id: 0 00:05:36.249 end heaps---------- 00:05:36.249 8 mempools totaling size 598.116089 MiB 00:05:36.249 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:36.249 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:36.249 size: 84.521057 MiB name: bdev_io_2598271 00:05:36.249 size: 51.011292 MiB name: evtpool_2598271 00:05:36.249 size: 50.003479 MiB name: msgpool_2598271 00:05:36.249 size: 21.763794 MiB name: PDU_Pool 00:05:36.249 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:36.249 size: 0.026123 MiB name: Session_Pool 00:05:36.249 end mempools------- 00:05:36.249 6 memzones totaling size 4.142822 MiB 00:05:36.249 size: 1.000366 MiB name: RG_ring_0_2598271 00:05:36.249 size: 1.000366 MiB name: RG_ring_1_2598271 00:05:36.249 size: 1.000366 MiB name: RG_ring_4_2598271 00:05:36.249 size: 1.000366 MiB name: RG_ring_5_2598271 00:05:36.249 size: 0.125366 MiB name: RG_ring_2_2598271 00:05:36.249 size: 0.015991 MiB name: RG_ring_3_2598271 00:05:36.249 end memzones------- 00:05:36.249 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:36.249 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:36.249 list of free elements. 
size: 12.519348 MiB 00:05:36.249 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:36.249 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:36.249 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:36.249 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:36.249 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:36.249 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:36.249 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:36.249 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:36.249 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:36.249 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:36.249 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:36.249 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:36.249 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:36.249 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:36.249 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:36.249 list of standard malloc elements. size: 199.218079 MiB 00:05:36.249 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:36.249 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:36.249 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:36.249 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:36.249 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:36.249 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:36.249 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:36.249 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:36.249 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:36.249 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:36.249 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:36.249 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:36.249 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:36.249 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:36.249 list of memzone associated elements. size: 602.262573 MiB 00:05:36.249 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:36.249 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:36.249 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:36.249 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:36.249 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:36.249 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2598271_0 00:05:36.249 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:36.249 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2598271_0 00:05:36.249 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:36.249 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2598271_0 00:05:36.249 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:36.249 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:36.249 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:36.249 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:36.249 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:36.249 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2598271 00:05:36.249 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:36.249 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2598271 00:05:36.249 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:36.249 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2598271 00:05:36.249 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:36.249 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:36.249 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:36.249 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:36.249 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:36.249 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:36.249 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:36.250 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:36.250 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:36.250 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2598271 00:05:36.250 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:36.250 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2598271 00:05:36.250 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:36.250 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2598271 00:05:36.250 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:36.250 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2598271 00:05:36.250 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:36.250 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2598271 00:05:36.250 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:36.250 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:36.250 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:36.250 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:36.250 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:36.250 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:36.250 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:36.250 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2598271 00:05:36.250 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:36.250 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:36.250 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:36.250 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:36.250 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:36.250 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2598271 00:05:36.250 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:36.250 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:36.250 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:36.250 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2598271 00:05:36.250 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:36.250 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2598271 00:05:36.250 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:36.250 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:36.250 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:36.250 06:48:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2598271 00:05:36.250 06:48:05 -- common/autotest_common.sh@926 -- # '[' -z 2598271 ']' 00:05:36.250 06:48:05 -- common/autotest_common.sh@930 -- # kill -0 2598271 00:05:36.250 06:48:05 -- common/autotest_common.sh@931 -- # uname 00:05:36.250 06:48:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:36.250 06:48:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2598271 00:05:36.250 06:48:06 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:36.250 06:48:06 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:36.250 06:48:06 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2598271' 00:05:36.250 killing process with pid 2598271 00:05:36.250 06:48:06 -- common/autotest_common.sh@945 -- # kill 2598271 00:05:36.250 06:48:06 -- common/autotest_common.sh@950 -- # wait 2598271 00:05:36.509 00:05:36.509 real 0m1.387s 00:05:36.509 user 0m1.441s 00:05:36.509 sys 0m0.419s 00:05:36.509 06:48:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.509 06:48:06 -- common/autotest_common.sh@10 -- # set +x 00:05:36.509 ************************************ 00:05:36.509 END TEST dpdk_mem_utility 00:05:36.509 ************************************ 00:05:36.509 06:48:06 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:36.509 06:48:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.509 06:48:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.509 06:48:06 -- common/autotest_common.sh@10 -- # set +x 
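The heap/mempool/memzone summaries above are produced in two steps: the env_dpdk_get_mem_stats RPC makes the target dump its DPDK memory state to /tmp/spdk_mem_dump.txt, and scripts/dpdk_mem_info.py renders that dump. A sketch of driving the same pair by hand against a running target (rpc_cmd in the trace is a wrapper around scripts/rpc.py; relative paths are illustrative):

    ./scripts/rpc.py env_dpdk_get_mem_stats    # writes /tmp/spdk_mem_dump.txt
    ./scripts/dpdk_mem_info.py                 # heap/mempool/memzone summary
    ./scripts/dpdk_mem_info.py -m 0            # per-element view of heap 0, as above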
00:05:36.509 ************************************ 00:05:36.509 START TEST event 00:05:36.509 ************************************ 00:05:36.509 06:48:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:36.768 * Looking for test storage... 00:05:36.769 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:36.769 06:48:06 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:36.769 06:48:06 -- bdev/nbd_common.sh@6 -- # set -e 00:05:36.769 06:48:06 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:36.769 06:48:06 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:36.769 06:48:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.769 06:48:06 -- common/autotest_common.sh@10 -- # set +x 00:05:36.769 ************************************ 00:05:36.769 START TEST event_perf 00:05:36.769 ************************************ 00:05:36.769 06:48:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:36.769 Running I/O for 1 seconds...[2024-04-27 06:48:06.443913] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:36.769 [2024-04-27 06:48:06.444020] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598688 ] 00:05:36.769 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.769 [2024-04-27 06:48:06.516348] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:36.769 [2024-04-27 06:48:06.555709] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:36.769 [2024-04-27 06:48:06.555805] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:36.769 [2024-04-27 06:48:06.555891] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:36.769 [2024-04-27 06:48:06.555892] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.148 Running I/O for 1 seconds... 00:05:38.148 lcore 0: 191648 00:05:38.148 lcore 1: 191648 00:05:38.148 lcore 2: 191649 00:05:38.148 lcore 3: 191651 00:05:38.148 done. 
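event_perf ran with -m 0xF, so one reactor was pinned to each of lcores 0-3 and each reported its own counter above. A tiny sketch of how such a hex coremask expands to core numbers (each set bit selects one lcore):

    mask=0xF
    for (( c = 0; c < 64; c++ )); do
        (( (mask >> c) & 1 )) && echo "lcore $c selected"
    done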
00:05:38.148 00:05:38.148 real 0m1.184s 00:05:38.148 user 0m4.085s 00:05:38.148 sys 0m0.098s 00:05:38.148 06:48:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.148 06:48:07 -- common/autotest_common.sh@10 -- # set +x 00:05:38.148 ************************************ 00:05:38.148 END TEST event_perf 00:05:38.148 ************************************ 00:05:38.148 06:48:07 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:38.148 06:48:07 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:38.148 06:48:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.148 06:48:07 -- common/autotest_common.sh@10 -- # set +x 00:05:38.148 ************************************ 00:05:38.148 START TEST event_reactor 00:05:38.148 ************************************ 00:05:38.148 06:48:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:38.148 [2024-04-27 06:48:07.672654] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:38.148 [2024-04-27 06:48:07.672791] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598870 ] 00:05:38.148 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.148 [2024-04-27 06:48:07.745182] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.148 [2024-04-27 06:48:07.780172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.087 test_start 00:05:39.087 oneshot 00:05:39.087 tick 100 00:05:39.087 tick 100 00:05:39.087 tick 250 00:05:39.087 tick 100 00:05:39.087 tick 100 00:05:39.087 tick 100 00:05:39.087 tick 250 00:05:39.087 tick 500 00:05:39.087 tick 100 00:05:39.087 tick 100 00:05:39.087 tick 250 00:05:39.087 tick 100 00:05:39.087 tick 100 00:05:39.087 test_end 00:05:39.087 00:05:39.087 real 0m1.178s 00:05:39.087 user 0m1.084s 00:05:39.087 sys 0m0.089s 00:05:39.087 06:48:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.087 06:48:08 -- common/autotest_common.sh@10 -- # set +x 00:05:39.087 ************************************ 00:05:39.087 END TEST event_reactor 00:05:39.087 ************************************ 00:05:39.087 06:48:08 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:39.087 06:48:08 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:39.087 06:48:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:39.087 06:48:08 -- common/autotest_common.sh@10 -- # set +x 00:05:39.087 ************************************ 00:05:39.087 START TEST event_reactor_perf 00:05:39.087 ************************************ 00:05:39.087 06:48:08 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:39.087 [2024-04-27 06:48:08.894132] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:39.087 [2024-04-27 06:48:08.894237] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599154 ] 00:05:39.087 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.087 [2024-04-27 06:48:08.964465] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.347 [2024-04-27 06:48:08.999147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.285 test_start 00:05:40.285 test_end 00:05:40.285 Performance: 907586 events per second 00:05:40.285 00:05:40.285 real 0m1.175s 00:05:40.285 user 0m1.083s 00:05:40.285 sys 0m0.087s 00:05:40.285 06:48:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:40.285 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.285 ************************************ 00:05:40.285 END TEST event_reactor_perf 00:05:40.285 ************************************ 00:05:40.285 06:48:10 -- event/event.sh@49 -- # uname -s 00:05:40.285 06:48:10 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:40.286 06:48:10 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:40.286 06:48:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:40.286 06:48:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:40.286 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.286 ************************************ 00:05:40.286 START TEST event_scheduler 00:05:40.286 ************************************ 00:05:40.286 06:48:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:40.545 * Looking for test storage... 00:05:40.545 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:40.545 06:48:10 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:40.545 06:48:10 -- scheduler/scheduler.sh@35 -- # scheduler_pid=2599464 00:05:40.545 06:48:10 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:40.545 06:48:10 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:40.545 06:48:10 -- scheduler/scheduler.sh@37 -- # waitforlisten 2599464 00:05:40.545 06:48:10 -- common/autotest_common.sh@819 -- # '[' -z 2599464 ']' 00:05:40.545 06:48:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:40.545 06:48:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:40.545 06:48:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:40.545 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:40.545 06:48:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:40.545 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.545 [2024-04-27 06:48:10.219604] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:40.545 [2024-04-27 06:48:10.219694] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599464 ] 00:05:40.545 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.545 [2024-04-27 06:48:10.289362] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:40.545 [2024-04-27 06:48:10.328409] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.545 [2024-04-27 06:48:10.328459] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:40.545 [2024-04-27 06:48:10.328552] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:40.545 [2024-04-27 06:48:10.328555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:40.545 06:48:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:40.545 06:48:10 -- common/autotest_common.sh@852 -- # return 0 00:05:40.545 06:48:10 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:40.545 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.545 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.545 POWER: Env isn't set yet! 00:05:40.545 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:40.546 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:40.546 POWER: Cannot set governor of lcore 0 to userspace 00:05:40.546 POWER: Attempting to initialise PSTAT power management... 00:05:40.546 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:40.546 POWER: Initialized successfully for lcore 0 power management 00:05:40.546 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:40.546 POWER: Initialized successfully for lcore 1 power management 00:05:40.546 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:40.546 POWER: Initialized successfully for lcore 2 power management 00:05:40.546 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:40.546 POWER: Initialized successfully for lcore 3 power management 00:05:40.546 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.546 06:48:10 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:40.546 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.546 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 [2024-04-27 06:48:10.497658] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
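The POWER lines above show the scheduler test switching each lcore's cpufreq governor to 'performance': the ACPI cpufreq write for lcore 0 fails first, then the PSTAT driver succeeds for all four cores. The knob being written is the standard cpufreq sysfs file; a sketch of the underlying mechanism (root required, CPU 0 shown):

    cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor     # current governor
    echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor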
00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:40.806 06:48:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:40.806 06:48:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 ************************************ 00:05:40.806 START TEST scheduler_create_thread 00:05:40.806 ************************************ 00:05:40.806 06:48:10 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 2 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 3 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 4 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 5 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 6 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 7 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 8 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 9 00:05:40.806 
06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.806 10 00:05:40.806 06:48:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.806 06:48:10 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:40.806 06:48:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.806 06:48:10 -- common/autotest_common.sh@10 -- # set +x 00:05:41.375 06:48:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:41.375 06:48:11 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:41.375 06:48:11 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:41.375 06:48:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:41.375 06:48:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.313 06:48:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.313 06:48:11 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:42.313 06:48:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.313 06:48:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.250 06:48:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.250 06:48:12 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:43.250 06:48:12 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:43.250 06:48:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.250 06:48:12 -- common/autotest_common.sh@10 -- # set +x 00:05:44.188 06:48:13 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:44.188 00:05:44.188 real 0m3.232s 00:05:44.188 user 0m0.020s 00:05:44.188 sys 0m0.011s 00:05:44.188 06:48:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:44.188 06:48:13 -- common/autotest_common.sh@10 -- # set +x 00:05:44.188 ************************************ 00:05:44.188 END TEST scheduler_create_thread 00:05:44.188 ************************************ 00:05:44.188 06:48:13 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:44.188 06:48:13 -- scheduler/scheduler.sh@46 -- # killprocess 2599464 00:05:44.188 06:48:13 -- common/autotest_common.sh@926 -- # '[' -z 2599464 ']' 00:05:44.188 06:48:13 -- common/autotest_common.sh@930 -- # kill -0 2599464 00:05:44.188 06:48:13 -- common/autotest_common.sh@931 -- # uname 00:05:44.188 06:48:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:44.188 06:48:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2599464 00:05:44.188 06:48:13 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:44.188 06:48:13 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:44.188 06:48:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2599464' 00:05:44.188 killing process with pid 2599464 00:05:44.188 06:48:13 -- common/autotest_common.sh@945 -- # kill 2599464 00:05:44.188 06:48:13 -- common/autotest_common.sh@950 -- # wait 2599464 00:05:44.448 [2024-04-27 06:48:14.118645] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
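Stripped of the xtrace framing, the scheduler_create_thread body above reduces to the following RPC sequence. This is a sketch reconstructed from the traced lines (rpc_cmd is the autotest wrapper around scripts/rpc.py; -m is a core mask, -a the thread's reported active percentage, and the thread_id captures mirror the 11 and 12 printed in the trace):

    # four active and four idle threads, pinned one per core across masks 0x1..0x8
    rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
    # unpinned threads, then drive the set-active and delete RPCs by returned thread id
    rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
    thread_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)   # -> 11
    rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$thread_id" 50
    thread_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)     # -> 12
    rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$thread_id"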
00:05:44.448 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:44.448 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:44.448 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:44.448 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:44.448 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:44.448 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:44.448 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:44.448 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:44.708 00:05:44.708 real 0m4.247s 00:05:44.708 user 0m7.426s 00:05:44.708 sys 0m0.352s 00:05:44.708 06:48:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:44.708 06:48:14 -- common/autotest_common.sh@10 -- # set +x 00:05:44.708 ************************************ 00:05:44.708 END TEST event_scheduler 00:05:44.708 ************************************ 00:05:44.708 06:48:14 -- event/event.sh@51 -- # modprobe -n nbd 00:05:44.708 06:48:14 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:44.708 06:48:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:44.708 06:48:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:44.708 06:48:14 -- common/autotest_common.sh@10 -- # set +x 00:05:44.708 ************************************ 00:05:44.708 START TEST app_repeat 00:05:44.708 ************************************ 00:05:44.708 06:48:14 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:44.708 06:48:14 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:44.708 06:48:14 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:44.708 06:48:14 -- event/event.sh@13 -- # local nbd_list 00:05:44.708 06:48:14 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:44.708 06:48:14 -- event/event.sh@14 -- # local bdev_list 00:05:44.708 06:48:14 -- event/event.sh@15 -- # local repeat_times=4 00:05:44.708 06:48:14 -- event/event.sh@17 -- # modprobe nbd 00:05:44.708 06:48:14 -- event/event.sh@19 -- # repeat_pid=2600312 00:05:44.708 06:48:14 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:44.708 06:48:14 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:44.708 06:48:14 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2600312' 00:05:44.708 Process app_repeat pid: 2600312 00:05:44.708 06:48:14 -- event/event.sh@23 -- # for i in {0..2} 00:05:44.708 06:48:14 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:44.708 spdk_app_start Round 0 00:05:44.708 06:48:14 -- event/event.sh@25 -- # waitforlisten 2600312 /var/tmp/spdk-nbd.sock 00:05:44.708 06:48:14 -- common/autotest_common.sh@819 -- # '[' -z 2600312 ']' 00:05:44.708 06:48:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:44.708 06:48:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:44.708 06:48:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
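The app_repeat stage that starts here follows the usual autotest pattern: launch the app in the background, set a cleanup trap, and poll until its RPC socket answers. Roughly, with $spdk_dir standing in for the workspace checkout path logged above:

    "$spdk_dir/test/event/app_repeat/app_repeat" -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 &
    repeat_pid=$!                                       # 2600312 in this run
    trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
    echo "Process app_repeat pid: $repeat_pid"
    waitforlisten "$repeat_pid" /var/tmp/spdk-nbd.sock  # blocks until the UNIX socket is up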
00:05:44.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:44.708 06:48:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:44.708 06:48:14 -- common/autotest_common.sh@10 -- # set +x 00:05:44.708 [2024-04-27 06:48:14.439679] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:44.708 [2024-04-27 06:48:14.439791] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2600312 ] 00:05:44.708 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.708 [2024-04-27 06:48:14.510715] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:44.708 [2024-04-27 06:48:14.548584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:44.708 [2024-04-27 06:48:14.548586] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.646 06:48:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:45.646 06:48:15 -- common/autotest_common.sh@852 -- # return 0 00:05:45.646 06:48:15 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:45.646 Malloc0 00:05:45.646 06:48:15 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:45.905 Malloc1 00:05:45.905 06:48:15 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@12 -- # local i 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:45.905 06:48:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:46.165 /dev/nbd0 00:05:46.165 06:48:15 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:46.165 06:48:15 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:46.165 06:48:15 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:46.165 06:48:15 -- common/autotest_common.sh@857 -- # local i 00:05:46.165 06:48:15 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:46.165 06:48:15 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:46.165 06:48:15 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:46.165 06:48:15 -- 
common/autotest_common.sh@861 -- # break 00:05:46.165 06:48:15 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:46.165 06:48:15 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:46.165 06:48:15 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:46.165 1+0 records in 00:05:46.165 1+0 records out 00:05:46.165 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024295 s, 16.9 MB/s 00:05:46.165 06:48:15 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:46.165 06:48:15 -- common/autotest_common.sh@874 -- # size=4096 00:05:46.165 06:48:15 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:46.165 06:48:15 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:46.165 06:48:15 -- common/autotest_common.sh@877 -- # return 0 00:05:46.165 06:48:15 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:46.165 06:48:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:46.165 06:48:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:46.165 /dev/nbd1 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:46.165 06:48:16 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:46.165 06:48:16 -- common/autotest_common.sh@857 -- # local i 00:05:46.165 06:48:16 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:46.165 06:48:16 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:46.165 06:48:16 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:46.165 06:48:16 -- common/autotest_common.sh@861 -- # break 00:05:46.165 06:48:16 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:46.165 06:48:16 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:46.165 06:48:16 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:46.165 1+0 records in 00:05:46.165 1+0 records out 00:05:46.165 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000114524 s, 35.8 MB/s 00:05:46.165 06:48:16 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:46.165 06:48:16 -- common/autotest_common.sh@874 -- # size=4096 00:05:46.165 06:48:16 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:46.165 06:48:16 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:46.165 06:48:16 -- common/autotest_common.sh@877 -- # return 0 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:46.165 06:48:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:46.425 { 00:05:46.425 "nbd_device": "/dev/nbd0", 00:05:46.425 "bdev_name": "Malloc0" 00:05:46.425 }, 00:05:46.425 { 00:05:46.425 "nbd_device": 
"/dev/nbd1", 00:05:46.425 "bdev_name": "Malloc1" 00:05:46.425 } 00:05:46.425 ]' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:46.425 { 00:05:46.425 "nbd_device": "/dev/nbd0", 00:05:46.425 "bdev_name": "Malloc0" 00:05:46.425 }, 00:05:46.425 { 00:05:46.425 "nbd_device": "/dev/nbd1", 00:05:46.425 "bdev_name": "Malloc1" 00:05:46.425 } 00:05:46.425 ]' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:46.425 /dev/nbd1' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:46.425 /dev/nbd1' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@65 -- # count=2 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@95 -- # count=2 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:46.425 256+0 records in 00:05:46.425 256+0 records out 00:05:46.425 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110761 s, 94.7 MB/s 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:46.425 256+0 records in 00:05:46.425 256+0 records out 00:05:46.425 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200149 s, 52.4 MB/s 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:46.425 06:48:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:46.685 256+0 records in 00:05:46.685 256+0 records out 00:05:46.685 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211766 s, 49.5 MB/s 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:46.685 06:48:16 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@51 -- # local i 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@41 -- # break 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@45 -- # return 0 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:46.685 06:48:16 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@41 -- # break 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@45 -- # return 0 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:46.944 06:48:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@65 -- # true 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@65 -- # count=0 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@104 -- # count=0 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:47.203 06:48:16 -- bdev/nbd_common.sh@109 -- # return 0 00:05:47.203 06:48:16 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
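The data round-trip that just passed is ordinary dd plus cmp; condensed from the trace, with $testdir standing in for the long spdk/test/event path:

    dd if=/dev/urandom of="$testdir/nbdrandtest" bs=4096 count=256       # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$testdir/nbdrandtest" of="$nbd" bs=4096 count=256 oflag=direct
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$testdir/nbdrandtest" "$nbd"   # any byte mismatch fails the round here
    done
    rm "$testdir/nbdrandtest"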
00:05:47.463 06:48:17 -- event/event.sh@35 -- # sleep 3 00:05:47.463 [2024-04-27 06:48:17.312185] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:47.463 [2024-04-27 06:48:17.344310] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:47.463 [2024-04-27 06:48:17.344313] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.722 [2024-04-27 06:48:17.383838] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:47.722 [2024-04-27 06:48:17.383879] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:50.258 06:48:20 -- event/event.sh@23 -- # for i in {0..2} 00:05:50.258 06:48:20 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:50.258 spdk_app_start Round 1 00:05:50.258 06:48:20 -- event/event.sh@25 -- # waitforlisten 2600312 /var/tmp/spdk-nbd.sock 00:05:50.258 06:48:20 -- common/autotest_common.sh@819 -- # '[' -z 2600312 ']' 00:05:50.258 06:48:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:50.258 06:48:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:50.258 06:48:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:50.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:50.258 06:48:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:50.258 06:48:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.518 06:48:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:50.518 06:48:20 -- common/autotest_common.sh@852 -- # return 0 00:05:50.518 06:48:20 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:50.778 Malloc0 00:05:50.778 06:48:20 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:50.778 Malloc1 00:05:51.038 06:48:20 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@12 -- # local i 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:51.038 
/dev/nbd0 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:51.038 06:48:20 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:51.038 06:48:20 -- common/autotest_common.sh@857 -- # local i 00:05:51.038 06:48:20 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:51.038 06:48:20 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:51.038 06:48:20 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:51.038 06:48:20 -- common/autotest_common.sh@861 -- # break 00:05:51.038 06:48:20 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:51.038 06:48:20 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:51.038 06:48:20 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:51.038 1+0 records in 00:05:51.038 1+0 records out 00:05:51.038 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021955 s, 18.7 MB/s 00:05:51.038 06:48:20 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:51.038 06:48:20 -- common/autotest_common.sh@874 -- # size=4096 00:05:51.038 06:48:20 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:51.038 06:48:20 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:51.038 06:48:20 -- common/autotest_common.sh@877 -- # return 0 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:51.038 06:48:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:51.297 /dev/nbd1 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:51.297 06:48:21 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:51.297 06:48:21 -- common/autotest_common.sh@857 -- # local i 00:05:51.297 06:48:21 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:51.297 06:48:21 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:51.297 06:48:21 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:51.297 06:48:21 -- common/autotest_common.sh@861 -- # break 00:05:51.297 06:48:21 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:51.297 06:48:21 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:51.297 06:48:21 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:51.297 1+0 records in 00:05:51.297 1+0 records out 00:05:51.297 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240555 s, 17.0 MB/s 00:05:51.297 06:48:21 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:51.297 06:48:21 -- common/autotest_common.sh@874 -- # size=4096 00:05:51.297 06:48:21 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:51.297 06:48:21 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:51.297 06:48:21 -- common/autotest_common.sh@877 -- # return 0 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:51.297 06:48:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:51.557 { 00:05:51.557 "nbd_device": "/dev/nbd0", 00:05:51.557 "bdev_name": "Malloc0" 00:05:51.557 }, 00:05:51.557 { 00:05:51.557 "nbd_device": "/dev/nbd1", 00:05:51.557 "bdev_name": "Malloc1" 00:05:51.557 } 00:05:51.557 ]' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:51.557 { 00:05:51.557 "nbd_device": "/dev/nbd0", 00:05:51.557 "bdev_name": "Malloc0" 00:05:51.557 }, 00:05:51.557 { 00:05:51.557 "nbd_device": "/dev/nbd1", 00:05:51.557 "bdev_name": "Malloc1" 00:05:51.557 } 00:05:51.557 ]' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:51.557 /dev/nbd1' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:51.557 /dev/nbd1' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@65 -- # count=2 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@95 -- # count=2 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:51.557 256+0 records in 00:05:51.557 256+0 records out 00:05:51.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114184 s, 91.8 MB/s 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:51.557 256+0 records in 00:05:51.557 256+0 records out 00:05:51.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197813 s, 53.0 MB/s 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:51.557 256+0 records in 00:05:51.557 256+0 records out 00:05:51.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208904 s, 50.2 MB/s 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@51 -- # local i 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:51.557 06:48:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@41 -- # break 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@45 -- # return 0 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:51.816 06:48:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@41 -- # break 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@45 -- # return 0 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:52.075 06:48:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:52.335 06:48:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:52.335 06:48:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:52.335 06:48:21 -- 
bdev/nbd_common.sh@65 -- # echo '' 00:05:52.335 06:48:21 -- bdev/nbd_common.sh@65 -- # true 00:05:52.335 06:48:21 -- bdev/nbd_common.sh@65 -- # count=0 00:05:52.335 06:48:22 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:52.335 06:48:22 -- bdev/nbd_common.sh@104 -- # count=0 00:05:52.335 06:48:22 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:52.335 06:48:22 -- bdev/nbd_common.sh@109 -- # return 0 00:05:52.335 06:48:22 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:52.335 06:48:22 -- event/event.sh@35 -- # sleep 3 00:05:52.595 [2024-04-27 06:48:22.359962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:52.595 [2024-04-27 06:48:22.392662] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.595 [2024-04-27 06:48:22.392665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.595 [2024-04-27 06:48:22.432240] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:52.595 [2024-04-27 06:48:22.432280] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:55.885 06:48:25 -- event/event.sh@23 -- # for i in {0..2} 00:05:55.885 06:48:25 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:55.885 spdk_app_start Round 2 00:05:55.885 06:48:25 -- event/event.sh@25 -- # waitforlisten 2600312 /var/tmp/spdk-nbd.sock 00:05:55.885 06:48:25 -- common/autotest_common.sh@819 -- # '[' -z 2600312 ']' 00:05:55.885 06:48:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:55.885 06:48:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:55.885 06:48:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:55.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
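waitfornbd, hit once per device in every round, polls /proc/partitions and then proves the device is actually readable. A sketch assembled from the autotest_common.sh@856-877 trace, simplified to a single read attempt where the helper loops up to 20 times, and with an assumed retry delay the trace does not show:

    waitfornbd() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1                                   # assumption: pacing between polls
        done
        # read one block back and require a non-empty result
        dd if="/dev/$nbd_name" of="$testdir/nbdtest" bs=4096 count=1 iflag=direct
        local size=$(stat -c %s "$testdir/nbdtest")
        rm -f "$testdir/nbdtest"
        [ "$size" != 0 ]
    }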
00:05:55.885 06:48:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:55.885 06:48:25 -- common/autotest_common.sh@10 -- # set +x 00:05:55.885 06:48:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:55.885 06:48:25 -- common/autotest_common.sh@852 -- # return 0 00:05:55.885 06:48:25 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:55.885 Malloc0 00:05:55.885 06:48:25 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:55.885 Malloc1 00:05:55.885 06:48:25 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@12 -- # local i 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.885 06:48:25 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:56.143 /dev/nbd0 00:05:56.143 06:48:25 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:56.143 06:48:25 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:56.143 06:48:25 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:56.143 06:48:25 -- common/autotest_common.sh@857 -- # local i 00:05:56.143 06:48:25 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:56.143 06:48:25 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:56.143 06:48:25 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:56.143 06:48:25 -- common/autotest_common.sh@861 -- # break 00:05:56.143 06:48:25 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:56.143 06:48:25 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:56.143 06:48:25 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:56.143 1+0 records in 00:05:56.143 1+0 records out 00:05:56.143 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000141673 s, 28.9 MB/s 00:05:56.143 06:48:25 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:56.143 06:48:25 -- common/autotest_common.sh@874 -- # size=4096 00:05:56.143 06:48:25 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:56.143 06:48:25 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:56.143 06:48:25 -- common/autotest_common.sh@877 -- # return 0 00:05:56.143 06:48:25 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:56.143 06:48:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:56.143 06:48:25 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:56.402 /dev/nbd1 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:56.402 06:48:26 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:56.402 06:48:26 -- common/autotest_common.sh@857 -- # local i 00:05:56.402 06:48:26 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:56.402 06:48:26 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:56.402 06:48:26 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:56.402 06:48:26 -- common/autotest_common.sh@861 -- # break 00:05:56.402 06:48:26 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:56.402 06:48:26 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:56.402 06:48:26 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:56.402 1+0 records in 00:05:56.402 1+0 records out 00:05:56.402 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248344 s, 16.5 MB/s 00:05:56.402 06:48:26 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:56.402 06:48:26 -- common/autotest_common.sh@874 -- # size=4096 00:05:56.402 06:48:26 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:56.402 06:48:26 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:56.402 06:48:26 -- common/autotest_common.sh@877 -- # return 0 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:56.402 { 00:05:56.402 "nbd_device": "/dev/nbd0", 00:05:56.402 "bdev_name": "Malloc0" 00:05:56.402 }, 00:05:56.402 { 00:05:56.402 "nbd_device": "/dev/nbd1", 00:05:56.402 "bdev_name": "Malloc1" 00:05:56.402 } 00:05:56.402 ]' 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:56.402 { 00:05:56.402 "nbd_device": "/dev/nbd0", 00:05:56.402 "bdev_name": "Malloc0" 00:05:56.402 }, 00:05:56.402 { 00:05:56.402 "nbd_device": "/dev/nbd1", 00:05:56.402 "bdev_name": "Malloc1" 00:05:56.402 } 00:05:56.402 ]' 00:05:56.402 06:48:26 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:56.660 /dev/nbd1' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:56.660 /dev/nbd1' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@65 -- # count=2 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:56.660 06:48:26 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:56.660 256+0 records in 00:05:56.660 256+0 records out 00:05:56.660 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110881 s, 94.6 MB/s 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:56.660 256+0 records in 00:05:56.660 256+0 records out 00:05:56.660 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200921 s, 52.2 MB/s 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:56.660 256+0 records in 00:05:56.660 256+0 records out 00:05:56.660 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0219444 s, 47.8 MB/s 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@51 -- # local i 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:56.660 06:48:26 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@41 -- # break 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@45 -- # return 0 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@41 -- # break 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@45 -- # return 0 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.918 06:48:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:57.176 06:48:26 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:57.176 06:48:26 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:57.176 06:48:26 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@65 -- # true 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@65 -- # count=0 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@104 -- # count=0 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:57.176 06:48:27 -- bdev/nbd_common.sh@109 -- # return 0 00:05:57.176 06:48:27 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:57.434 06:48:27 -- event/event.sh@35 -- # sleep 3 00:05:57.692 [2024-04-27 06:48:27.382501] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:57.692 [2024-04-27 06:48:27.414985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.692 [2024-04-27 06:48:27.414986] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.692 [2024-04-27 06:48:27.454631] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:57.692 [2024-04-27 06:48:27.454673] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
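The teardown check traced just above asks the target which NBD devices remain and requires the answer to be none; in one piece (rpc.py abbreviating the full scripts/rpc.py path, and || true inferred from the bare 'true' in the trace, since grep -c exits non-zero on zero matches):

    nbd_disks_json=$(rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks)           # '[]' once both disks stop
    nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
    if [ "$count" -ne 0 ]; then
        exit 1    # a leftover /dev/nbd* means nbd_stop_disk failed
    fi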
00:06:01.079 06:48:30 -- event/event.sh@38 -- # waitforlisten 2600312 /var/tmp/spdk-nbd.sock 00:06:01.079 06:48:30 -- common/autotest_common.sh@819 -- # '[' -z 2600312 ']' 00:06:01.079 06:48:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:01.079 06:48:30 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:01.079 06:48:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:01.079 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:01.079 06:48:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:01.079 06:48:30 -- common/autotest_common.sh@10 -- # set +x 00:06:01.079 06:48:30 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:01.079 06:48:30 -- common/autotest_common.sh@852 -- # return 0 00:06:01.079 06:48:30 -- event/event.sh@39 -- # killprocess 2600312 00:06:01.079 06:48:30 -- common/autotest_common.sh@926 -- # '[' -z 2600312 ']' 00:06:01.079 06:48:30 -- common/autotest_common.sh@930 -- # kill -0 2600312 00:06:01.079 06:48:30 -- common/autotest_common.sh@931 -- # uname 00:06:01.079 06:48:30 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:01.079 06:48:30 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2600312 00:06:01.079 06:48:30 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:01.079 06:48:30 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:01.079 06:48:30 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2600312' 00:06:01.079 killing process with pid 2600312 00:06:01.079 06:48:30 -- common/autotest_common.sh@945 -- # kill 2600312 00:06:01.079 06:48:30 -- common/autotest_common.sh@950 -- # wait 2600312 00:06:01.079 spdk_app_start is called in Round 0. 00:06:01.079 Shutdown signal received, stop current app iteration 00:06:01.079 Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 reinitialization... 00:06:01.079 spdk_app_start is called in Round 1. 00:06:01.079 Shutdown signal received, stop current app iteration 00:06:01.079 Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 reinitialization... 00:06:01.079 spdk_app_start is called in Round 2. 00:06:01.079 Shutdown signal received, stop current app iteration 00:06:01.080 Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 reinitialization... 00:06:01.080 spdk_app_start is called in Round 3. 
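killprocess, used on the scheduler app earlier and on app_repeat here, is deliberately careful about what it signals. A sketch reconstructed from the autotest_common.sh@926-950 trace (the non-Linux branch is omitted):

    killprocess() {
        local pid=$1 process_name
        [ -z "$pid" ] && return 1
        kill -0 "$pid" || return 1                   # must still be running
        if [ "$(uname)" = Linux ]; then
            process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 above
        fi
        [ "$process_name" = sudo ] && return 1       # the @936 guard: never signal a bare sudo
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                  # reap it so the pid is really gone
    }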
00:06:01.080 Shutdown signal received, stop current app iteration 00:06:01.080 06:48:30 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:01.080 06:48:30 -- event/event.sh@42 -- # return 0 00:06:01.080 00:06:01.080 real 0m16.174s 00:06:01.080 user 0m34.451s 00:06:01.080 sys 0m3.049s 00:06:01.080 06:48:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.080 06:48:30 -- common/autotest_common.sh@10 -- # set +x 00:06:01.080 ************************************ 00:06:01.080 END TEST app_repeat 00:06:01.080 ************************************ 00:06:01.080 06:48:30 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:01.080 06:48:30 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:01.080 06:48:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:01.080 06:48:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.080 06:48:30 -- common/autotest_common.sh@10 -- # set +x 00:06:01.080 ************************************ 00:06:01.080 START TEST cpu_locks 00:06:01.080 ************************************ 00:06:01.080 06:48:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:01.080 * Looking for test storage... 00:06:01.080 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:01.080 06:48:30 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:01.080 06:48:30 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:01.080 06:48:30 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:01.080 06:48:30 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:01.080 06:48:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:01.080 06:48:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.080 06:48:30 -- common/autotest_common.sh@10 -- # set +x 00:06:01.080 ************************************ 00:06:01.080 START TEST default_locks 00:06:01.080 ************************************ 00:06:01.080 06:48:30 -- common/autotest_common.sh@1104 -- # default_locks 00:06:01.080 06:48:30 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=2603266 00:06:01.080 06:48:30 -- event/cpu_locks.sh@47 -- # waitforlisten 2603266 00:06:01.080 06:48:30 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:01.080 06:48:30 -- common/autotest_common.sh@819 -- # '[' -z 2603266 ']' 00:06:01.080 06:48:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.080 06:48:30 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:01.080 06:48:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.080 06:48:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:01.080 06:48:30 -- common/autotest_common.sh@10 -- # set +x 00:06:01.080 [2024-04-27 06:48:30.762585] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:01.080 [2024-04-27 06:48:30.762682] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603266 ] 00:06:01.080 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.080 [2024-04-27 06:48:30.832107] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.080 [2024-04-27 06:48:30.868152] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:01.080 [2024-04-27 06:48:30.868273] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.018 06:48:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:02.018 06:48:31 -- common/autotest_common.sh@852 -- # return 0 00:06:02.018 06:48:31 -- event/cpu_locks.sh@49 -- # locks_exist 2603266 00:06:02.018 06:48:31 -- event/cpu_locks.sh@22 -- # lslocks -p 2603266 00:06:02.018 06:48:31 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:02.277 lslocks: write error 00:06:02.277 06:48:32 -- event/cpu_locks.sh@50 -- # killprocess 2603266 00:06:02.277 06:48:32 -- common/autotest_common.sh@926 -- # '[' -z 2603266 ']' 00:06:02.277 06:48:32 -- common/autotest_common.sh@930 -- # kill -0 2603266 00:06:02.277 06:48:32 -- common/autotest_common.sh@931 -- # uname 00:06:02.277 06:48:32 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:02.277 06:48:32 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2603266 00:06:02.277 06:48:32 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:02.277 06:48:32 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:02.278 06:48:32 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2603266' 00:06:02.278 killing process with pid 2603266 00:06:02.278 06:48:32 -- common/autotest_common.sh@945 -- # kill 2603266 00:06:02.278 06:48:32 -- common/autotest_common.sh@950 -- # wait 2603266 00:06:02.537 06:48:32 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 2603266 00:06:02.537 06:48:32 -- common/autotest_common.sh@640 -- # local es=0 00:06:02.537 06:48:32 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2603266 00:06:02.537 06:48:32 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:02.537 06:48:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:02.537 06:48:32 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:02.537 06:48:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:02.537 06:48:32 -- common/autotest_common.sh@643 -- # waitforlisten 2603266 00:06:02.537 06:48:32 -- common/autotest_common.sh@819 -- # '[' -z 2603266 ']' 00:06:02.537 06:48:32 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:02.537 06:48:32 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:02.537 06:48:32 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:02.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
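locks_exist (cpu_locks.sh@22 above) decides whether a pid holds an SPDK CPU-core lock by listing its file locks; the stray "lslocks: write error" is only lslocks complaining about the pipe that grep -q closes as soon as it matches. Reconstructed from the trace:

    # Reconstructed from the cpu_locks.sh@22 lines; the lock files themselves are
    # /var/tmp/spdk_cpu_lock_NNN, as check_remaining_locks shows further down.
    locks_exist() {
      lslocks -p "$1" | grep -q spdk_cpu_lock
    }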
00:06:02.537 06:48:32 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:02.537 06:48:32 -- common/autotest_common.sh@10 -- # set +x 00:06:02.537 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2603266) - No such process 00:06:02.537 ERROR: process (pid: 2603266) is no longer running 00:06:02.537 06:48:32 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:02.537 06:48:32 -- common/autotest_common.sh@852 -- # return 1 00:06:02.537 06:48:32 -- common/autotest_common.sh@643 -- # es=1 00:06:02.537 06:48:32 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:02.537 06:48:32 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:02.537 06:48:32 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:02.537 06:48:32 -- event/cpu_locks.sh@54 -- # no_locks 00:06:02.537 06:48:32 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:02.537 06:48:32 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:02.537 06:48:32 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:02.537 00:06:02.537 real 0m1.694s 00:06:02.537 user 0m1.783s 00:06:02.537 sys 0m0.633s 00:06:02.537 06:48:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:02.797 06:48:32 -- common/autotest_common.sh@10 -- # set +x 00:06:02.797 ************************************ 00:06:02.797 END TEST default_locks 00:06:02.797 ************************************ 00:06:02.797 06:48:32 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:02.797 06:48:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:02.797 06:48:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:02.797 06:48:32 -- common/autotest_common.sh@10 -- # set +x 00:06:02.797 ************************************ 00:06:02.797 START TEST default_locks_via_rpc 00:06:02.797 ************************************ 00:06:02.797 06:48:32 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:06:02.797 06:48:32 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=2603578 00:06:02.797 06:48:32 -- event/cpu_locks.sh@63 -- # waitforlisten 2603578 00:06:02.797 06:48:32 -- common/autotest_common.sh@819 -- # '[' -z 2603578 ']' 00:06:02.797 06:48:32 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:02.797 06:48:32 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:02.797 06:48:32 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:02.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:02.797 06:48:32 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:02.797 06:48:32 -- common/autotest_common.sh@10 -- # set +x 00:06:02.797 06:48:32 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:02.797 [2024-04-27 06:48:32.502333] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
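default_locks passes precisely because waitforlisten on the killed pid fails ("No such process", es=1): the NOT wrapper inverts the status, and no_locks then asserts that no /var/tmp/spdk_cpu_lock_* files outlive the target. A much-simplified sketch of both helpers, assuming nullglob so an empty glob yields an empty array (the real NOT in autotest_common.sh also validates its argument and inspects es):

    shopt -s nullglob
    NOT() { ! "$@"; }                            # succeed only if the wrapped command fails
    no_locks() {
      local lock_files=( /var/tmp/spdk_cpu_lock_* )
      (( ${#lock_files[@]} == 0 ))               # the "(( 0 != 0 ))" check above never fires
    }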
00:06:02.797 [2024-04-27 06:48:32.502406] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603578 ] 00:06:02.797 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.797 [2024-04-27 06:48:32.569819] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.797 [2024-04-27 06:48:32.606925] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:02.797 [2024-04-27 06:48:32.607044] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.734 06:48:33 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:03.734 06:48:33 -- common/autotest_common.sh@852 -- # return 0 00:06:03.734 06:48:33 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:03.734 06:48:33 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:03.734 06:48:33 -- common/autotest_common.sh@10 -- # set +x 00:06:03.734 06:48:33 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:03.734 06:48:33 -- event/cpu_locks.sh@67 -- # no_locks 00:06:03.734 06:48:33 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:03.734 06:48:33 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:03.734 06:48:33 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:03.734 06:48:33 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:03.734 06:48:33 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:03.734 06:48:33 -- common/autotest_common.sh@10 -- # set +x 00:06:03.734 06:48:33 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:03.734 06:48:33 -- event/cpu_locks.sh@71 -- # locks_exist 2603578 00:06:03.734 06:48:33 -- event/cpu_locks.sh@22 -- # lslocks -p 2603578 00:06:03.734 06:48:33 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:03.734 06:48:33 -- event/cpu_locks.sh@73 -- # killprocess 2603578 00:06:03.734 06:48:33 -- common/autotest_common.sh@926 -- # '[' -z 2603578 ']' 00:06:03.734 06:48:33 -- common/autotest_common.sh@930 -- # kill -0 2603578 00:06:03.734 06:48:33 -- common/autotest_common.sh@931 -- # uname 00:06:03.734 06:48:33 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:03.734 06:48:33 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2603578 00:06:03.734 06:48:33 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:03.734 06:48:33 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:03.734 06:48:33 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2603578' 00:06:03.734 killing process with pid 2603578 00:06:03.734 06:48:33 -- common/autotest_common.sh@945 -- # kill 2603578 00:06:03.734 06:48:33 -- common/autotest_common.sh@950 -- # wait 2603578 00:06:04.303 00:06:04.303 real 0m1.416s 00:06:04.303 user 0m1.443s 00:06:04.303 sys 0m0.492s 00:06:04.303 06:48:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.303 06:48:33 -- common/autotest_common.sh@10 -- # set +x 00:06:04.303 ************************************ 00:06:04.303 END TEST default_locks_via_rpc 00:06:04.303 ************************************ 00:06:04.303 06:48:33 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:04.303 06:48:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:04.303 06:48:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:04.303 06:48:33 -- 
common/autotest_common.sh@10 -- # set +x 00:06:04.303 ************************************ 00:06:04.303 START TEST non_locking_app_on_locked_coremask 00:06:04.303 ************************************ 00:06:04.303 06:48:33 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:06:04.303 06:48:33 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=2603863 00:06:04.303 06:48:33 -- event/cpu_locks.sh@81 -- # waitforlisten 2603863 /var/tmp/spdk.sock 00:06:04.303 06:48:33 -- common/autotest_common.sh@819 -- # '[' -z 2603863 ']' 00:06:04.303 06:48:33 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:04.303 06:48:33 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:04.303 06:48:33 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:04.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:04.303 06:48:33 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:04.303 06:48:33 -- common/autotest_common.sh@10 -- # set +x 00:06:04.303 06:48:33 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:04.303 [2024-04-27 06:48:33.961944] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:04.303 [2024-04-27 06:48:33.962027] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603863 ] 00:06:04.303 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.303 [2024-04-27 06:48:34.032372] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.303 [2024-04-27 06:48:34.069569] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:04.303 [2024-04-27 06:48:34.069685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.241 06:48:34 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:05.241 06:48:34 -- common/autotest_common.sh@852 -- # return 0 00:06:05.241 06:48:34 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=2604127 00:06:05.241 06:48:34 -- event/cpu_locks.sh@85 -- # waitforlisten 2604127 /var/tmp/spdk2.sock 00:06:05.241 06:48:34 -- common/autotest_common.sh@819 -- # '[' -z 2604127 ']' 00:06:05.241 06:48:34 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:05.241 06:48:34 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:05.241 06:48:34 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:05.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:05.241 06:48:34 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:05.241 06:48:34 -- common/autotest_common.sh@10 -- # set +x 00:06:05.241 06:48:34 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:05.241 [2024-04-27 06:48:34.795694] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
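Both targets in this test ask for core mask 0x1; the second can still start because --disable-cpumask-locks skips the core claim entirely, which the "CPU core locks deactivated" notice confirms just below. The shape of the two launches, copied from the cpu_locks.sh@79/@83 lines with $SPDK_BIN standing in for the long build path:

    "$SPDK_BIN/spdk_tgt" -m 0x1 &                                                # claims core 0
    pid1=$!
    "$SPDK_BIN/spdk_tgt" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock & # claims nothing
    pid2=$!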
00:06:05.241 [2024-04-27 06:48:34.795785] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2604127 ] 00:06:05.241 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.241 [2024-04-27 06:48:34.886496] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:05.241 [2024-04-27 06:48:34.886518] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.241 [2024-04-27 06:48:34.954507] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:05.241 [2024-04-27 06:48:34.954620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.811 06:48:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:05.811 06:48:35 -- common/autotest_common.sh@852 -- # return 0 00:06:05.811 06:48:35 -- event/cpu_locks.sh@87 -- # locks_exist 2603863 00:06:05.811 06:48:35 -- event/cpu_locks.sh@22 -- # lslocks -p 2603863 00:06:05.811 06:48:35 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:06.776 lslocks: write error 00:06:06.776 06:48:36 -- event/cpu_locks.sh@89 -- # killprocess 2603863 00:06:06.776 06:48:36 -- common/autotest_common.sh@926 -- # '[' -z 2603863 ']' 00:06:06.776 06:48:36 -- common/autotest_common.sh@930 -- # kill -0 2603863 00:06:06.776 06:48:36 -- common/autotest_common.sh@931 -- # uname 00:06:06.776 06:48:36 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:06.776 06:48:36 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2603863 00:06:06.776 06:48:36 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:06.776 06:48:36 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:06.776 06:48:36 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2603863' 00:06:06.776 killing process with pid 2603863 00:06:06.776 06:48:36 -- common/autotest_common.sh@945 -- # kill 2603863 00:06:06.776 06:48:36 -- common/autotest_common.sh@950 -- # wait 2603863 00:06:07.344 06:48:37 -- event/cpu_locks.sh@90 -- # killprocess 2604127 00:06:07.344 06:48:37 -- common/autotest_common.sh@926 -- # '[' -z 2604127 ']' 00:06:07.344 06:48:37 -- common/autotest_common.sh@930 -- # kill -0 2604127 00:06:07.344 06:48:37 -- common/autotest_common.sh@931 -- # uname 00:06:07.344 06:48:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:07.344 06:48:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2604127 00:06:07.603 06:48:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:07.603 06:48:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:07.603 06:48:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2604127' 00:06:07.603 killing process with pid 2604127 00:06:07.603 06:48:37 -- common/autotest_common.sh@945 -- # kill 2604127 00:06:07.603 06:48:37 -- common/autotest_common.sh@950 -- # wait 2604127 00:06:07.862 00:06:07.862 real 0m3.591s 00:06:07.862 user 0m3.825s 00:06:07.862 sys 0m1.180s 00:06:07.862 06:48:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.862 06:48:37 -- common/autotest_common.sh@10 -- # set +x 00:06:07.862 ************************************ 00:06:07.862 END TEST non_locking_app_on_locked_coremask 00:06:07.862 ************************************ 00:06:07.862 06:48:37 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:06:07.862 06:48:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:07.862 06:48:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:07.862 06:48:37 -- common/autotest_common.sh@10 -- # set +x 00:06:07.862 ************************************ 00:06:07.862 START TEST locking_app_on_unlocked_coremask 00:06:07.862 ************************************ 00:06:07.862 06:48:37 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:06:07.862 06:48:37 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=2604699 00:06:07.862 06:48:37 -- event/cpu_locks.sh@99 -- # waitforlisten 2604699 /var/tmp/spdk.sock 00:06:07.862 06:48:37 -- common/autotest_common.sh@819 -- # '[' -z 2604699 ']' 00:06:07.862 06:48:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:07.862 06:48:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:07.862 06:48:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:07.862 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:07.862 06:48:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:07.862 06:48:37 -- common/autotest_common.sh@10 -- # set +x 00:06:07.862 06:48:37 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:07.862 [2024-04-27 06:48:37.594542] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:07.862 [2024-04-27 06:48:37.594634] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2604699 ] 00:06:07.862 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.862 [2024-04-27 06:48:37.662770] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:07.862 [2024-04-27 06:48:37.662795] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.862 [2024-04-27 06:48:37.700206] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:07.862 [2024-04-27 06:48:37.700321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.799 06:48:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:08.799 06:48:38 -- common/autotest_common.sh@852 -- # return 0 00:06:08.799 06:48:38 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=2604715 00:06:08.799 06:48:38 -- event/cpu_locks.sh@103 -- # waitforlisten 2604715 /var/tmp/spdk2.sock 00:06:08.799 06:48:38 -- common/autotest_common.sh@819 -- # '[' -z 2604715 ']' 00:06:08.799 06:48:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:08.799 06:48:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:08.799 06:48:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:08.799 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
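This test flips the roles of the previous one: the first target opts out of locking, so the lock-enabled second target is the one that ends up owning the core-0 lock file, which is why locks_exist is later run against pid2 (2604715) rather than pid1. Same placeholder convention as before:

    "$SPDK_BIN/spdk_tgt" -m 0x1 --disable-cpumask-locks &    # pid1: no claim
    "$SPDK_BIN/spdk_tgt" -m 0x1 -r /var/tmp/spdk2.sock &     # pid2: takes the core 0 lock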
00:06:08.799 06:48:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:08.799 06:48:38 -- common/autotest_common.sh@10 -- # set +x 00:06:08.799 06:48:38 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:08.799 [2024-04-27 06:48:38.425184] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:08.799 [2024-04-27 06:48:38.425271] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2604715 ] 00:06:08.799 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.799 [2024-04-27 06:48:38.517843] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.799 [2024-04-27 06:48:38.594231] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:08.799 [2024-04-27 06:48:38.594348] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.368 06:48:39 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:09.368 06:48:39 -- common/autotest_common.sh@852 -- # return 0 00:06:09.368 06:48:39 -- event/cpu_locks.sh@105 -- # locks_exist 2604715 00:06:09.368 06:48:39 -- event/cpu_locks.sh@22 -- # lslocks -p 2604715 00:06:09.368 06:48:39 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:10.771 lslocks: write error 00:06:10.771 06:48:40 -- event/cpu_locks.sh@107 -- # killprocess 2604699 00:06:10.771 06:48:40 -- common/autotest_common.sh@926 -- # '[' -z 2604699 ']' 00:06:10.771 06:48:40 -- common/autotest_common.sh@930 -- # kill -0 2604699 00:06:10.771 06:48:40 -- common/autotest_common.sh@931 -- # uname 00:06:10.771 06:48:40 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:10.771 06:48:40 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2604699 00:06:10.771 06:48:40 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:10.771 06:48:40 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:10.771 06:48:40 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2604699' 00:06:10.771 killing process with pid 2604699 00:06:10.771 06:48:40 -- common/autotest_common.sh@945 -- # kill 2604699 00:06:10.771 06:48:40 -- common/autotest_common.sh@950 -- # wait 2604699 00:06:11.338 06:48:40 -- event/cpu_locks.sh@108 -- # killprocess 2604715 00:06:11.338 06:48:40 -- common/autotest_common.sh@926 -- # '[' -z 2604715 ']' 00:06:11.338 06:48:40 -- common/autotest_common.sh@930 -- # kill -0 2604715 00:06:11.338 06:48:40 -- common/autotest_common.sh@931 -- # uname 00:06:11.338 06:48:40 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:11.338 06:48:40 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2604715 00:06:11.338 06:48:40 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:11.338 06:48:40 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:11.338 06:48:40 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2604715' 00:06:11.338 killing process with pid 2604715 00:06:11.338 06:48:40 -- common/autotest_common.sh@945 -- # kill 2604715 00:06:11.338 06:48:40 -- common/autotest_common.sh@950 -- # wait 2604715 00:06:11.597 00:06:11.597 real 0m3.710s 00:06:11.597 user 0m3.948s 00:06:11.597 sys 0m1.258s 00:06:11.597 06:48:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.597 06:48:41 -- 
common/autotest_common.sh@10 -- # set +x 00:06:11.597 ************************************ 00:06:11.597 END TEST locking_app_on_unlocked_coremask 00:06:11.597 ************************************ 00:06:11.597 06:48:41 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:11.597 06:48:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:11.597 06:48:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.597 06:48:41 -- common/autotest_common.sh@10 -- # set +x 00:06:11.597 ************************************ 00:06:11.597 START TEST locking_app_on_locked_coremask 00:06:11.597 ************************************ 00:06:11.597 06:48:41 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:06:11.597 06:48:41 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=2605289 00:06:11.597 06:48:41 -- event/cpu_locks.sh@116 -- # waitforlisten 2605289 /var/tmp/spdk.sock 00:06:11.597 06:48:41 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:11.597 06:48:41 -- common/autotest_common.sh@819 -- # '[' -z 2605289 ']' 00:06:11.597 06:48:41 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.597 06:48:41 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:11.597 06:48:41 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.597 06:48:41 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:11.597 06:48:41 -- common/autotest_common.sh@10 -- # set +x 00:06:11.597 [2024-04-27 06:48:41.353165] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:11.597 [2024-04-27 06:48:41.353239] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605289 ] 00:06:11.597 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.597 [2024-04-27 06:48:41.420361] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.597 [2024-04-27 06:48:41.452152] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:11.597 [2024-04-27 06:48:41.452270] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.535 06:48:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:12.535 06:48:42 -- common/autotest_common.sh@852 -- # return 0 00:06:12.535 06:48:42 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=2605527 00:06:12.535 06:48:42 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 2605527 /var/tmp/spdk2.sock 00:06:12.535 06:48:42 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:12.535 06:48:42 -- common/autotest_common.sh@640 -- # local es=0 00:06:12.535 06:48:42 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2605527 /var/tmp/spdk2.sock 00:06:12.535 06:48:42 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:12.535 06:48:42 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:12.535 06:48:42 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:12.535 06:48:42 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:12.535 06:48:42 -- common/autotest_common.sh@643 -- # waitforlisten 2605527 /var/tmp/spdk2.sock 00:06:12.535 06:48:42 -- common/autotest_common.sh@819 -- # '[' -z 2605527 ']' 00:06:12.535 06:48:42 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:12.535 06:48:42 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:12.535 06:48:42 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:12.535 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:12.535 06:48:42 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:12.535 06:48:42 -- common/autotest_common.sh@10 -- # set +x 00:06:12.535 [2024-04-27 06:48:42.182062] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:12.535 [2024-04-27 06:48:42.182149] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605527 ] 00:06:12.535 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.535 [2024-04-27 06:48:42.277319] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 2605289 has claimed it. 00:06:12.535 [2024-04-27 06:48:42.277357] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
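With locking enabled on both sides the second target cannot come up at all: claim_cpu_cores logs "Cannot create lock on core 0, probably process 2605289 has claimed it" and spdk_app_start aborts. The harness therefore makes the failure the pass condition, in the spirit of the NOT sketch above:

    # The test passes exactly because this wait fails: pid2 already exited.
    NOT waitforlisten "$pid2" /var/tmp/spdk2.sock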
00:06:13.105 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2605527) - No such process 00:06:13.105 ERROR: process (pid: 2605527) is no longer running 00:06:13.105 06:48:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:13.105 06:48:42 -- common/autotest_common.sh@852 -- # return 1 00:06:13.105 06:48:42 -- common/autotest_common.sh@643 -- # es=1 00:06:13.105 06:48:42 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:13.105 06:48:42 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:13.105 06:48:42 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:13.105 06:48:42 -- event/cpu_locks.sh@122 -- # locks_exist 2605289 00:06:13.105 06:48:42 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:13.105 06:48:42 -- event/cpu_locks.sh@22 -- # lslocks -p 2605289 00:06:13.673 lslocks: write error 00:06:13.673 06:48:43 -- event/cpu_locks.sh@124 -- # killprocess 2605289 00:06:13.673 06:48:43 -- common/autotest_common.sh@926 -- # '[' -z 2605289 ']' 00:06:13.673 06:48:43 -- common/autotest_common.sh@930 -- # kill -0 2605289 00:06:13.673 06:48:43 -- common/autotest_common.sh@931 -- # uname 00:06:13.673 06:48:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:13.673 06:48:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2605289 00:06:13.673 06:48:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:13.673 06:48:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:13.673 06:48:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2605289' 00:06:13.673 killing process with pid 2605289 00:06:13.673 06:48:43 -- common/autotest_common.sh@945 -- # kill 2605289 00:06:13.673 06:48:43 -- common/autotest_common.sh@950 -- # wait 2605289 00:06:13.933 00:06:13.933 real 0m2.370s 00:06:13.933 user 0m2.588s 00:06:13.933 sys 0m0.730s 00:06:13.933 06:48:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.933 06:48:43 -- common/autotest_common.sh@10 -- # set +x 00:06:13.933 ************************************ 00:06:13.933 END TEST locking_app_on_locked_coremask 00:06:13.933 ************************************ 00:06:13.933 06:48:43 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:13.933 06:48:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:13.933 06:48:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:13.933 06:48:43 -- common/autotest_common.sh@10 -- # set +x 00:06:13.933 ************************************ 00:06:13.933 START TEST locking_overlapped_coremask 00:06:13.933 ************************************ 00:06:13.933 06:48:43 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:13.933 06:48:43 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=2605844 00:06:13.933 06:48:43 -- event/cpu_locks.sh@133 -- # waitforlisten 2605844 /var/tmp/spdk.sock 00:06:13.933 06:48:43 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:13.933 06:48:43 -- common/autotest_common.sh@819 -- # '[' -z 2605844 ']' 00:06:13.933 06:48:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.933 06:48:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:13.933 06:48:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:13.933 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:13.933 06:48:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:13.933 06:48:43 -- common/autotest_common.sh@10 -- # set +x 00:06:13.933 [2024-04-27 06:48:43.773469] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:13.933 [2024-04-27 06:48:43.773541] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605844 ] 00:06:13.933 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.192 [2024-04-27 06:48:43.842250] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:14.192 [2024-04-27 06:48:43.881629] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:14.192 [2024-04-27 06:48:43.881792] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.192 [2024-04-27 06:48:43.881906] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.192 [2024-04-27 06:48:43.881906] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:14.761 06:48:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:14.761 06:48:44 -- common/autotest_common.sh@852 -- # return 0 00:06:14.761 06:48:44 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=2605873 00:06:14.761 06:48:44 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 2605873 /var/tmp/spdk2.sock 00:06:14.761 06:48:44 -- common/autotest_common.sh@640 -- # local es=0 00:06:14.761 06:48:44 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2605873 /var/tmp/spdk2.sock 00:06:14.761 06:48:44 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:14.761 06:48:44 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:14.761 06:48:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:14.761 06:48:44 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:14.761 06:48:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:14.761 06:48:44 -- common/autotest_common.sh@643 -- # waitforlisten 2605873 /var/tmp/spdk2.sock 00:06:14.761 06:48:44 -- common/autotest_common.sh@819 -- # '[' -z 2605873 ']' 00:06:14.761 06:48:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:14.761 06:48:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:14.761 06:48:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:14.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:14.761 06:48:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:14.761 06:48:44 -- common/autotest_common.sh@10 -- # set +x 00:06:14.761 [2024-04-27 06:48:44.628843] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
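The first target holds mask 0x7, i.e. cores 0 through 2 (note the three "Reactor started" notices); the second requests 0x1c, cores 2 through 4. The bitwise intersection predicts exactly which core the claim failure on the next lines will name:

    printf 'overlap mask: 0x%x\n' $(( 0x7 & 0x1c ))    # 0x4: only bit 2 is shared
    for i in {0..4}; do
      (( ( (0x7 & 0x1c) >> i ) & 1 )) && echo "shared core: $i"   # prints: shared core: 2
    done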
00:06:14.761 [2024-04-27 06:48:44.628931] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605873 ] 00:06:15.020 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.020 [2024-04-27 06:48:44.724886] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2605844 has claimed it. 00:06:15.020 [2024-04-27 06:48:44.724922] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:15.588 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2605873) - No such process 00:06:15.588 ERROR: process (pid: 2605873) is no longer running 00:06:15.588 06:48:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:15.588 06:48:45 -- common/autotest_common.sh@852 -- # return 1 00:06:15.588 06:48:45 -- common/autotest_common.sh@643 -- # es=1 00:06:15.588 06:48:45 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:15.588 06:48:45 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:15.588 06:48:45 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:15.588 06:48:45 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:15.588 06:48:45 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:15.588 06:48:45 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:15.588 06:48:45 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:15.588 06:48:45 -- event/cpu_locks.sh@141 -- # killprocess 2605844 00:06:15.588 06:48:45 -- common/autotest_common.sh@926 -- # '[' -z 2605844 ']' 00:06:15.588 06:48:45 -- common/autotest_common.sh@930 -- # kill -0 2605844 00:06:15.588 06:48:45 -- common/autotest_common.sh@931 -- # uname 00:06:15.588 06:48:45 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.588 06:48:45 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2605844 00:06:15.588 06:48:45 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.588 06:48:45 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.588 06:48:45 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2605844' 00:06:15.588 killing process with pid 2605844 00:06:15.588 06:48:45 -- common/autotest_common.sh@945 -- # kill 2605844 00:06:15.588 06:48:45 -- common/autotest_common.sh@950 -- # wait 2605844 00:06:15.847 00:06:15.847 real 0m1.871s 00:06:15.847 user 0m5.342s 00:06:15.847 sys 0m0.466s 00:06:15.847 06:48:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.847 06:48:45 -- common/autotest_common.sh@10 -- # set +x 00:06:15.847 ************************************ 00:06:15.847 END TEST locking_overlapped_coremask 00:06:15.847 ************************************ 00:06:15.847 06:48:45 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:15.847 06:48:45 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:15.847 06:48:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:15.847 06:48:45 -- common/autotest_common.sh@10 -- # set +x 00:06:15.847 ************************************ 00:06:15.847 
START TEST locking_overlapped_coremask_via_rpc 00:06:15.847 ************************************ 00:06:15.847 06:48:45 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:15.847 06:48:45 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=2606161 00:06:15.847 06:48:45 -- event/cpu_locks.sh@149 -- # waitforlisten 2606161 /var/tmp/spdk.sock 00:06:15.847 06:48:45 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:15.847 06:48:45 -- common/autotest_common.sh@819 -- # '[' -z 2606161 ']' 00:06:15.847 06:48:45 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.847 06:48:45 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:15.847 06:48:45 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.847 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.847 06:48:45 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:15.847 06:48:45 -- common/autotest_common.sh@10 -- # set +x 00:06:15.847 [2024-04-27 06:48:45.694973] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:15.847 [2024-04-27 06:48:45.695048] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2606161 ] 00:06:15.847 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.106 [2024-04-27 06:48:45.763370] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:16.106 [2024-04-27 06:48:45.763406] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:16.106 [2024-04-27 06:48:45.798662] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.106 [2024-04-27 06:48:45.798817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:16.106 [2024-04-27 06:48:45.798933] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.106 [2024-04-27 06:48:45.798935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:16.673 06:48:46 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:16.673 06:48:46 -- common/autotest_common.sh@852 -- # return 0 00:06:16.673 06:48:46 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=2606266 00:06:16.673 06:48:46 -- event/cpu_locks.sh@153 -- # waitforlisten 2606266 /var/tmp/spdk2.sock 00:06:16.673 06:48:46 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:16.673 06:48:46 -- common/autotest_common.sh@819 -- # '[' -z 2606266 ']' 00:06:16.673 06:48:46 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.673 06:48:46 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:16.673 06:48:46 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.673 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:16.673 06:48:46 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:16.673 06:48:46 -- common/autotest_common.sh@10 -- # set +x 00:06:16.673 [2024-04-27 06:48:46.539234] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:16.673 [2024-04-27 06:48:46.539323] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2606266 ] 00:06:16.932 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.932 [2024-04-27 06:48:46.635122] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:16.932 [2024-04-27 06:48:46.635151] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:16.932 [2024-04-27 06:48:46.708897] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.932 [2024-04-27 06:48:46.709070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:16.932 [2024-04-27 06:48:46.712439] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:16.932 [2024-04-27 06:48:46.712440] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:17.500 06:48:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:17.500 06:48:47 -- common/autotest_common.sh@852 -- # return 0 00:06:17.500 06:48:47 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:17.500 06:48:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:17.500 06:48:47 -- common/autotest_common.sh@10 -- # set +x 00:06:17.500 06:48:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:17.500 06:48:47 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:17.500 06:48:47 -- common/autotest_common.sh@640 -- # local es=0 00:06:17.500 06:48:47 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:17.500 06:48:47 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:17.500 06:48:47 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:17.500 06:48:47 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:17.500 06:48:47 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:17.500 06:48:47 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:17.500 06:48:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:17.500 06:48:47 -- common/autotest_common.sh@10 -- # set +x 00:06:17.500 [2024-04-27 06:48:47.381454] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2606161 has claimed it. 
00:06:17.500 request: 00:06:17.500 { 00:06:17.500 "method": "framework_enable_cpumask_locks", 00:06:17.500 "req_id": 1 00:06:17.500 } 00:06:17.500 Got JSON-RPC error response 00:06:17.500 response: 00:06:17.500 { 00:06:17.500 "code": -32603, 00:06:17.500 "message": "Failed to claim CPU core: 2" 00:06:17.500 } 00:06:17.500 06:48:47 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:17.500 06:48:47 -- common/autotest_common.sh@643 -- # es=1 00:06:17.500 06:48:47 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:17.500 06:48:47 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:17.500 06:48:47 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:17.500 06:48:47 -- event/cpu_locks.sh@158 -- # waitforlisten 2606161 /var/tmp/spdk.sock 00:06:17.500 06:48:47 -- common/autotest_common.sh@819 -- # '[' -z 2606161 ']' 00:06:17.500 06:48:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:17.500 06:48:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:17.500 06:48:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:17.500 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:17.500 06:48:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:17.500 06:48:47 -- common/autotest_common.sh@10 -- # set +x 00:06:17.759 06:48:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:17.759 06:48:47 -- common/autotest_common.sh@852 -- # return 0 00:06:17.759 06:48:47 -- event/cpu_locks.sh@159 -- # waitforlisten 2606266 /var/tmp/spdk2.sock 00:06:17.759 06:48:47 -- common/autotest_common.sh@819 -- # '[' -z 2606266 ']' 00:06:17.759 06:48:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:17.759 06:48:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:17.759 06:48:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:17.759 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
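Since both targets were started with --disable-cpumask-locks, the core claim is deferred to the framework_enable_cpumask_locks RPC: the first target takes cores 0 through 2, and the same call against the second target's socket returns the JSON-RPC error above (code -32603, "Failed to claim CPU core: 2"). Issued by hand it would look roughly like this, assuming SPDK's stock scripts/rpc.py, which the rpc_cmd helper in the trace wraps:

    ./scripts/rpc.py -s /var/tmp/spdk.sock  framework_enable_cpumask_locks   # first target: succeeds
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks   # second: -32603, core 2 claimed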
00:06:17.759 06:48:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:17.759 06:48:47 -- common/autotest_common.sh@10 -- # set +x 00:06:18.019 06:48:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:18.019 06:48:47 -- common/autotest_common.sh@852 -- # return 0 00:06:18.019 06:48:47 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:18.019 06:48:47 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:18.019 06:48:47 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:18.019 06:48:47 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:18.019 00:06:18.019 real 0m2.077s 00:06:18.019 user 0m0.807s 00:06:18.019 sys 0m0.208s 00:06:18.019 06:48:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:18.019 06:48:47 -- common/autotest_common.sh@10 -- # set +x 00:06:18.019 ************************************ 00:06:18.019 END TEST locking_overlapped_coremask_via_rpc 00:06:18.019 ************************************ 00:06:18.019 06:48:47 -- event/cpu_locks.sh@174 -- # cleanup 00:06:18.019 06:48:47 -- event/cpu_locks.sh@15 -- # [[ -z 2606161 ]] 00:06:18.019 06:48:47 -- event/cpu_locks.sh@15 -- # killprocess 2606161 00:06:18.019 06:48:47 -- common/autotest_common.sh@926 -- # '[' -z 2606161 ']' 00:06:18.019 06:48:47 -- common/autotest_common.sh@930 -- # kill -0 2606161 00:06:18.019 06:48:47 -- common/autotest_common.sh@931 -- # uname 00:06:18.019 06:48:47 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:18.019 06:48:47 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2606161 00:06:18.019 06:48:47 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:18.019 06:48:47 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:18.019 06:48:47 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2606161' 00:06:18.019 killing process with pid 2606161 00:06:18.019 06:48:47 -- common/autotest_common.sh@945 -- # kill 2606161 00:06:18.019 06:48:47 -- common/autotest_common.sh@950 -- # wait 2606161 00:06:18.278 06:48:48 -- event/cpu_locks.sh@16 -- # [[ -z 2606266 ]] 00:06:18.278 06:48:48 -- event/cpu_locks.sh@16 -- # killprocess 2606266 00:06:18.278 06:48:48 -- common/autotest_common.sh@926 -- # '[' -z 2606266 ']' 00:06:18.278 06:48:48 -- common/autotest_common.sh@930 -- # kill -0 2606266 00:06:18.278 06:48:48 -- common/autotest_common.sh@931 -- # uname 00:06:18.278 06:48:48 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:18.278 06:48:48 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2606266 00:06:18.537 06:48:48 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:18.537 06:48:48 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:18.537 06:48:48 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2606266' 00:06:18.537 killing process with pid 2606266 00:06:18.537 06:48:48 -- common/autotest_common.sh@945 -- # kill 2606266 00:06:18.537 06:48:48 -- common/autotest_common.sh@950 -- # wait 2606266 00:06:18.795 06:48:48 -- event/cpu_locks.sh@18 -- # rm -f 00:06:18.795 06:48:48 -- event/cpu_locks.sh@1 -- # cleanup 00:06:18.795 06:48:48 -- event/cpu_locks.sh@15 -- # [[ -z 2606161 ]] 00:06:18.795 06:48:48 -- event/cpu_locks.sh@15 -- # killprocess 2606161 
00:06:18.795 06:48:48 -- common/autotest_common.sh@926 -- # '[' -z 2606161 ']' 00:06:18.795 06:48:48 -- common/autotest_common.sh@930 -- # kill -0 2606161 00:06:18.795 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2606161) - No such process 00:06:18.795 06:48:48 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2606161 is not found' 00:06:18.795 Process with pid 2606161 is not found 00:06:18.795 06:48:48 -- event/cpu_locks.sh@16 -- # [[ -z 2606266 ]] 00:06:18.795 06:48:48 -- event/cpu_locks.sh@16 -- # killprocess 2606266 00:06:18.795 06:48:48 -- common/autotest_common.sh@926 -- # '[' -z 2606266 ']' 00:06:18.795 06:48:48 -- common/autotest_common.sh@930 -- # kill -0 2606266 00:06:18.795 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2606266) - No such process 00:06:18.795 06:48:48 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2606266 is not found' 00:06:18.795 Process with pid 2606266 is not found 00:06:18.795 06:48:48 -- event/cpu_locks.sh@18 -- # rm -f 00:06:18.795 00:06:18.795 real 0m17.866s 00:06:18.795 user 0m30.340s 00:06:18.795 sys 0m5.870s 00:06:18.795 06:48:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:18.795 06:48:48 -- common/autotest_common.sh@10 -- # set +x 00:06:18.795 ************************************ 00:06:18.795 END TEST cpu_locks 00:06:18.795 ************************************ 00:06:18.795 00:06:18.795 real 0m42.205s 00:06:18.795 user 1m18.578s 00:06:18.795 sys 0m9.871s 00:06:18.795 06:48:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:18.795 06:48:48 -- common/autotest_common.sh@10 -- # set +x 00:06:18.795 ************************************ 00:06:18.795 END TEST event 00:06:18.795 ************************************ 00:06:18.795 06:48:48 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:18.795 06:48:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:18.795 06:48:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:18.795 06:48:48 -- common/autotest_common.sh@10 -- # set +x 00:06:18.795 ************************************ 00:06:18.795 START TEST thread 00:06:18.795 ************************************ 00:06:18.795 06:48:48 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:18.795 * Looking for test storage... 00:06:19.055 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:19.055 06:48:48 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:19.055 06:48:48 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:19.055 06:48:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.055 06:48:48 -- common/autotest_common.sh@10 -- # set +x 00:06:19.055 ************************************ 00:06:19.055 START TEST thread_poller_perf 00:06:19.055 ************************************ 00:06:19.055 06:48:48 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:19.055 [2024-04-27 06:48:48.719816] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:19.055 [2024-04-27 06:48:48.719914] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2606801 ] 00:06:19.055 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.055 [2024-04-27 06:48:48.792536] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.055 [2024-04-27 06:48:48.829059] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.055 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:19.992 ====================================== 00:06:19.992 busy:2505332568 (cyc) 00:06:19.992 total_run_count: 811000 00:06:19.992 tsc_hz: 2500000000 (cyc) 00:06:19.992 ====================================== 00:06:19.992 poller_cost: 3089 (cyc), 1235 (nsec) 00:06:19.992 00:06:19.992 real 0m1.184s 00:06:19.992 user 0m1.090s 00:06:19.992 sys 0m0.089s 00:06:19.992 06:48:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.992 06:48:49 -- common/autotest_common.sh@10 -- # set +x 00:06:19.992 ************************************ 00:06:19.992 END TEST thread_poller_perf 00:06:19.992 ************************************ 00:06:20.251 06:48:49 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:20.251 06:48:49 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:20.251 06:48:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:20.251 06:48:49 -- common/autotest_common.sh@10 -- # set +x 00:06:20.251 ************************************ 00:06:20.251 START TEST thread_poller_perf 00:06:20.251 ************************************ 00:06:20.251 06:48:49 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:20.251 [2024-04-27 06:48:49.953298] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:20.251 [2024-04-27 06:48:49.953389] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607081 ] 00:06:20.251 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.251 [2024-04-27 06:48:50.025791] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.251 [2024-04-27 06:48:50.064841] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.251 Running 1000 pollers for 1 seconds with 0 microseconds period. 
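The perf summary is plain division: poller_cost is busy cycles over total_run_count, and the nanosecond figure follows from the 2.5 GHz TSC. Reproducing the first run's numbers:

    echo $(( 2505332568 / 811000 ))              # 3089 cycles per poller invocation
    echo $(( 3089 * 1000000000 / 2500000000 ))   # 1235 ns at tsc_hz = 2500000000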
00:06:21.630 ====================================== 00:06:21.630 busy:2501998174 (cyc) 00:06:21.630 total_run_count: 14322000 00:06:21.630 tsc_hz: 2500000000 (cyc) 00:06:21.630 ====================================== 00:06:21.630 poller_cost: 174 (cyc), 69 (nsec) 00:06:21.630 00:06:21.630 real 0m1.183s 00:06:21.630 user 0m1.095s 00:06:21.630 sys 0m0.085s 00:06:21.630 06:48:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.630 06:48:51 -- common/autotest_common.sh@10 -- # set +x 00:06:21.630 ************************************ 00:06:21.630 END TEST thread_poller_perf 00:06:21.630 ************************************ 00:06:21.630 06:48:51 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:21.630 06:48:51 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:21.630 06:48:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:21.630 06:48:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:21.630 06:48:51 -- common/autotest_common.sh@10 -- # set +x 00:06:21.630 ************************************ 00:06:21.630 START TEST thread_spdk_lock 00:06:21.630 ************************************ 00:06:21.630 06:48:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:21.630 [2024-04-27 06:48:51.187808] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:21.630 [2024-04-27 06:48:51.187900] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607214 ] 00:06:21.630 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.630 [2024-04-27 06:48:51.261661] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:21.630 [2024-04-27 06:48:51.298715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.630 [2024-04-27 06:48:51.298717] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.889 [2024-04-27 06:48:51.784788] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:21.889 [2024-04-27 06:48:51.784825] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:21.889 [2024-04-27 06:48:51.784835] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x133de80 00:06:22.149 [2024-04-27 06:48:51.785676] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:22.149 [2024-04-27 06:48:51.785782] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:22.149 [2024-04-27 06:48:51.785802] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 
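The second run uses a 0 microsecond poller period (-l 0) and the per-call cost drops from 3089 to 174 cycles, roughly an 18x difference, because the 1 microsecond period of the first run exercises the timed-poller path. The *ERROR* spinlock lines that follow are expected output: spdk_lock deliberately provokes "lock held while going off CPU" and deadlock conditions to prove the detectors fire. Same arithmetic as before:

    echo $(( 2501998174 / 14322000 ))            # 174 cycles per call
    echo $(( 174 * 1000000000 / 2500000000 ))    # 69 ns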
00:06:22.149 Starting test contend 00:06:22.149 Worker Delay Wait us Hold us Total us 00:06:22.149 0 3 176779 183694 360474 00:06:22.149 1 5 87323 285660 372984 00:06:22.149 PASS test contend 00:06:22.149 Starting test hold_by_poller 00:06:22.149 PASS test hold_by_poller 00:06:22.149 Starting test hold_by_message 00:06:22.149 PASS test hold_by_message 00:06:22.149 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:22.149 100014 assertions passed 00:06:22.149 0 assertions failed 00:06:22.149 00:06:22.149 real 0m0.666s 00:06:22.149 user 0m1.056s 00:06:22.149 sys 0m0.093s 00:06:22.149 06:48:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.149 06:48:51 -- common/autotest_common.sh@10 -- # set +x 00:06:22.149 ************************************ 00:06:22.149 END TEST thread_spdk_lock 00:06:22.149 ************************************ 00:06:22.149 00:06:22.149 real 0m3.287s 00:06:22.149 user 0m3.334s 00:06:22.149 sys 0m0.464s 00:06:22.149 06:48:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.149 06:48:51 -- common/autotest_common.sh@10 -- # set +x 00:06:22.149 ************************************ 00:06:22.149 END TEST thread 00:06:22.149 ************************************ 00:06:22.149 06:48:51 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:22.149 06:48:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:22.149 06:48:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:22.149 06:48:51 -- common/autotest_common.sh@10 -- # set +x 00:06:22.149 ************************************ 00:06:22.149 START TEST accel 00:06:22.149 ************************************ 00:06:22.149 06:48:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:22.149 * Looking for test storage... 00:06:22.149 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:22.149 06:48:52 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:22.149 06:48:52 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:22.149 06:48:52 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:22.149 06:48:52 -- accel/accel.sh@59 -- # spdk_tgt_pid=2607440 00:06:22.149 06:48:52 -- accel/accel.sh@60 -- # waitforlisten 2607440 00:06:22.149 06:48:52 -- common/autotest_common.sh@819 -- # '[' -z 2607440 ']' 00:06:22.149 06:48:52 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:22.149 06:48:52 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:22.149 06:48:52 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:22.149 06:48:52 -- accel/accel.sh@58 -- # build_accel_config 00:06:22.149 06:48:52 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:22.149 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:22.149 06:48:52 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:22.149 06:48:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.149 06:48:52 -- common/autotest_common.sh@10 -- # set +x 00:06:22.149 06:48:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.149 06:48:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.149 06:48:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.149 06:48:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.149 06:48:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.149 06:48:52 -- accel/accel.sh@42 -- # jq -r . 00:06:22.409 [2024-04-27 06:48:52.055593] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:22.409 [2024-04-27 06:48:52.055680] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607440 ] 00:06:22.409 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.409 [2024-04-27 06:48:52.124209] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.409 [2024-04-27 06:48:52.160944] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:22.409 [2024-04-27 06:48:52.161062] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.980 06:48:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:22.980 06:48:52 -- common/autotest_common.sh@852 -- # return 0 00:06:22.980 06:48:52 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:22.980 06:48:52 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:22.980 06:48:52 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:22.980 06:48:52 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:22.980 06:48:52 -- common/autotest_common.sh@10 -- # set +x 00:06:23.239 06:48:52 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 
06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # IFS== 00:06:23.239 06:48:52 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.239 06:48:52 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.239 06:48:52 -- accel/accel.sh@67 -- # killprocess 2607440 00:06:23.239 06:48:52 -- common/autotest_common.sh@926 -- # '[' -z 2607440 ']' 00:06:23.240 06:48:52 -- common/autotest_common.sh@930 -- # kill -0 2607440 00:06:23.240 06:48:52 -- common/autotest_common.sh@931 -- # uname 00:06:23.240 06:48:52 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:23.240 06:48:52 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2607440 00:06:23.240 06:48:52 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:23.240 06:48:52 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:23.240 06:48:52 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2607440' 00:06:23.240 killing process with pid 2607440 00:06:23.240 06:48:52 -- common/autotest_common.sh@945 -- # kill 2607440 00:06:23.240 06:48:52 -- common/autotest_common.sh@950 -- # wait 2607440 00:06:23.498 06:48:53 -- accel/accel.sh@68 -- # trap - ERR 00:06:23.498 06:48:53 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:23.498 06:48:53 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:23.498 06:48:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.498 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:23.498 06:48:53 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:23.498 06:48:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:23.498 06:48:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.498 06:48:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.498 06:48:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.498 06:48:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.498 06:48:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.498 06:48:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.498 06:48:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.498 06:48:53 -- accel/accel.sh@42 -- # jq -r . 
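The long IFS== / read -r opc module loop above is get_expected_opcs parsing the accel_get_opc_assignments RPC: with no hardware accel modules loaded, every opcode is expected to map to the software module. The same table can be pulled from a running spdk_tgt directly (a sketch using SPDK's rpc.py and the exact jq filter the harness uses):

WS=/var/jenkins/workspace/short-fuzz-phy-autotest
"$WS/spdk/scripts/rpc.py" accel_get_opc_assignments \
    | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
# expected on this node: copy=software, fill=software, crc32c=software, ...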
00:06:23.498 06:48:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.498 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:23.498 06:48:53 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:23.498 06:48:53 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:23.498 06:48:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.498 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:23.498 ************************************ 00:06:23.498 START TEST accel_missing_filename 00:06:23.498 ************************************ 00:06:23.498 06:48:53 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:23.498 06:48:53 -- common/autotest_common.sh@640 -- # local es=0 00:06:23.498 06:48:53 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:23.498 06:48:53 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:23.498 06:48:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:23.498 06:48:53 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:23.498 06:48:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:23.498 06:48:53 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:23.498 06:48:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.498 06:48:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.498 06:48:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.498 06:48:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.498 06:48:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:23.498 06:48:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.498 06:48:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.498 06:48:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.498 06:48:53 -- accel/accel.sh@42 -- # jq -r . 00:06:23.498 [2024-04-27 06:48:53.354571] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:23.498 [2024-04-27 06:48:53.354661] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607746 ] 00:06:23.498 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.758 [2024-04-27 06:48:53.423813] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.758 [2024-04-27 06:48:53.459533] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.758 [2024-04-27 06:48:53.498423] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:23.758 [2024-04-27 06:48:53.558449] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:23.758 A filename is required. 
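accel_missing_filename passes precisely because accel_perf fails here: -w compress without -l has no input file, the app stops, and "A filename is required." is the expected diagnostic. The es= bookkeeping that follows is the harness's expected-failure wrapper normalizing the exit status (the raw 234 is folded to 106 and then mapped to 1, as the next lines show). A rough bash sketch of such a NOT-style wrapper (ours; the harness's actual helper may differ in detail):

NOT() {
    local es=0
    "$@" || es=$?
    (( es > 128 )) && es=$(( es - 128 ))   # fold high statuses, cf. es=234 -> es=106 below
    (( es != 0 ))                          # invert: the wrapped command must fail
}
# usage: NOT accel_perf -t 1 -w compress   # succeeds because -l is missing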
00:06:23.758 06:48:53 -- common/autotest_common.sh@643 -- # es=234 00:06:23.758 06:48:53 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:23.758 06:48:53 -- common/autotest_common.sh@652 -- # es=106 00:06:23.758 06:48:53 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:23.758 06:48:53 -- common/autotest_common.sh@660 -- # es=1 00:06:23.758 06:48:53 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:23.758 00:06:23.758 real 0m0.282s 00:06:23.758 user 0m0.171s 00:06:23.758 sys 0m0.131s 00:06:23.758 06:48:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.758 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:23.758 ************************************ 00:06:23.758 END TEST accel_missing_filename 00:06:23.758 ************************************ 00:06:24.018 06:48:53 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.018 06:48:53 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:24.018 06:48:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:24.018 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:24.018 ************************************ 00:06:24.018 START TEST accel_compress_verify 00:06:24.018 ************************************ 00:06:24.018 06:48:53 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.018 06:48:53 -- common/autotest_common.sh@640 -- # local es=0 00:06:24.018 06:48:53 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.018 06:48:53 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:24.018 06:48:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.018 06:48:53 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:24.018 06:48:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.018 06:48:53 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.018 06:48:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.018 06:48:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.018 06:48:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.018 06:48:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.018 06:48:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.018 06:48:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.018 06:48:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.018 06:48:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.018 06:48:53 -- accel/accel.sh@42 -- # jq -r . 00:06:24.018 [2024-04-27 06:48:53.677171] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:24.018 [2024-04-27 06:48:53.677238] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607771 ] 00:06:24.018 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.018 [2024-04-27 06:48:53.739084] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.018 [2024-04-27 06:48:53.774811] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.018 [2024-04-27 06:48:53.814069] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:24.018 [2024-04-27 06:48:53.873651] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:24.277 00:06:24.278 Compression does not support the verify option, aborting. 00:06:24.278 06:48:53 -- common/autotest_common.sh@643 -- # es=161 00:06:24.278 06:48:53 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:24.278 06:48:53 -- common/autotest_common.sh@652 -- # es=33 00:06:24.278 06:48:53 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:24.278 06:48:53 -- common/autotest_common.sh@660 -- # es=1 00:06:24.278 06:48:53 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:24.278 00:06:24.278 real 0m0.268s 00:06:24.278 user 0m0.183s 00:06:24.278 sys 0m0.125s 00:06:24.278 06:48:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:24.278 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 END TEST accel_compress_verify 00:06:24.278 ************************************ 00:06:24.278 06:48:53 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:24.278 06:48:53 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:24.278 06:48:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:24.278 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 START TEST accel_wrong_workload 00:06:24.278 ************************************ 00:06:24.278 06:48:53 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:24.278 06:48:53 -- common/autotest_common.sh@640 -- # local es=0 00:06:24.278 06:48:53 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:24.278 06:48:53 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:24.278 06:48:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.278 06:48:53 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:24.278 06:48:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.278 06:48:53 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:24.278 06:48:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.278 06:48:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.278 06:48:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.278 06:48:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:24.278 06:48:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.278 06:48:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.278 06:48:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.278 06:48:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.278 06:48:53 -- accel/accel.sh@42 -- # jq -r . 
00:06:24.278 Unsupported workload type: foobar 00:06:24.278 [2024-04-27 06:48:53.992237] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:24.278 accel_perf options: 00:06:24.278 [-h help message] 00:06:24.278 [-q queue depth per core] 00:06:24.278 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:24.278 [-T number of threads per core 00:06:24.278 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:24.278 [-t time in seconds] 00:06:24.278 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:24.278 [ dif_verify, , dif_generate, dif_generate_copy 00:06:24.278 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:24.278 [-l for compress/decompress workloads, name of uncompressed input file 00:06:24.278 [-S for crc32c workload, use this seed value (default 0) 00:06:24.278 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:24.278 [-f for fill workload, use this BYTE value (default 255) 00:06:24.278 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:24.278 [-y verify result if this switch is on] 00:06:24.278 [-a tasks to allocate per core (default: same value as -q)] 00:06:24.278 Can be used to spread operations across a wider range of memory. 00:06:24.278 06:48:53 -- common/autotest_common.sh@643 -- # es=1 00:06:24.278 06:48:53 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:24.278 06:48:53 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:24.278 06:48:53 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:24.278 00:06:24.278 real 0m0.023s 00:06:24.278 user 0m0.011s 00:06:24.278 sys 0m0.012s 00:06:24.278 06:48:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:24.278 06:48:53 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 END TEST accel_wrong_workload 00:06:24.278 ************************************ 00:06:24.278 Error: writing output failed: Broken pipe 00:06:24.278 06:48:54 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:24.278 06:48:54 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:24.278 06:48:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:24.278 06:48:54 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 START TEST accel_negative_buffers 00:06:24.278 ************************************ 00:06:24.278 06:48:54 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:24.278 06:48:54 -- common/autotest_common.sh@640 -- # local es=0 00:06:24.278 06:48:54 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:24.278 06:48:54 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:24.278 06:48:54 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.278 06:48:54 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:24.278 06:48:54 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.278 06:48:54 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:24.278 06:48:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.278 06:48:54 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:24.278 06:48:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.278 06:48:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.278 06:48:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.278 06:48:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.278 06:48:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.278 06:48:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.278 06:48:54 -- accel/accel.sh@42 -- # jq -r . 00:06:24.278 -x option must be non-negative. 00:06:24.278 [2024-04-27 06:48:54.066354] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:24.278 accel_perf options: 00:06:24.278 [-h help message] 00:06:24.278 [-q queue depth per core] 00:06:24.278 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:24.278 [-T number of threads per core 00:06:24.278 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:24.278 [-t time in seconds] 00:06:24.278 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:24.278 [ dif_verify, , dif_generate, dif_generate_copy 00:06:24.278 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:24.278 [-l for compress/decompress workloads, name of uncompressed input file 00:06:24.278 [-S for crc32c workload, use this seed value (default 0) 00:06:24.278 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:24.278 [-f for fill workload, use this BYTE value (default 255) 00:06:24.278 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:24.278 [-y verify result if this switch is on] 00:06:24.278 [-a tasks to allocate per core (default: same value as -q)] 00:06:24.278 Can be used to spread operations across a wider range of memory. 
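That usage text (printed once per rejected option parse, hence the repeat) is the reference for every accel_* case in this section. For orientation, the invocations exercised below, assembled only from flags it documents (workspace paths as above; a sketch, not a replay):

WS=/var/jenkins/workspace/short-fuzz-phy-autotest
AP=$WS/spdk/build/examples/accel_perf
$AP -t 1 -w crc32c -S 32 -y               # CRC-32C, seed 32, verify results
$AP -t 1 -w crc32c -y -C 2                # CRC-32C over 2-element io vectors
$AP -t 1 -w copy -y                       # plain buffer copy
$AP -t 1 -w fill -f 128 -q 64 -a 64 -y    # fill byte 128 (0x80), queue depth 64
$AP -t 1 -w compress -l $WS/spdk/test/accel/bib   # compress; note -y is rejected for compress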
00:06:24.278 06:48:54 -- common/autotest_common.sh@643 -- # es=1 00:06:24.278 06:48:54 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:24.278 06:48:54 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:24.278 06:48:54 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:24.278 00:06:24.278 real 0m0.029s 00:06:24.278 user 0m0.013s 00:06:24.278 sys 0m0.016s 00:06:24.278 06:48:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:24.278 06:48:54 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 END TEST accel_negative_buffers 00:06:24.278 ************************************ 00:06:24.278 Error: writing output failed: Broken pipe 00:06:24.278 06:48:54 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:24.278 06:48:54 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:24.278 06:48:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:24.278 06:48:54 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 START TEST accel_crc32c 00:06:24.278 ************************************ 00:06:24.278 06:48:54 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:24.278 06:48:54 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.278 06:48:54 -- accel/accel.sh@17 -- # local accel_module 00:06:24.278 06:48:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:24.278 06:48:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:24.278 06:48:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.278 06:48:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.278 06:48:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.279 06:48:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.279 06:48:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.279 06:48:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.279 06:48:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.279 06:48:54 -- accel/accel.sh@42 -- # jq -r . 00:06:24.279 [2024-04-27 06:48:54.130287] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:24.279 [2024-04-27 06:48:54.130337] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607843 ] 00:06:24.279 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.538 [2024-04-27 06:48:54.191930] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.538 [2024-04-27 06:48:54.229259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.918 06:48:55 -- accel/accel.sh@18 -- # out=' 00:06:25.918 SPDK Configuration: 00:06:25.918 Core mask: 0x1 00:06:25.918 00:06:25.918 Accel Perf Configuration: 00:06:25.918 Workload Type: crc32c 00:06:25.918 CRC-32C seed: 32 00:06:25.918 Transfer size: 4096 bytes 00:06:25.918 Vector count 1 00:06:25.918 Module: software 00:06:25.918 Queue depth: 32 00:06:25.918 Allocate depth: 32 00:06:25.918 # threads/core: 1 00:06:25.918 Run time: 1 seconds 00:06:25.918 Verify: Yes 00:06:25.918 00:06:25.918 Running for 1 seconds... 
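The crc32c runs that follow stay on the software path (Module: software in each summary) with 4096-byte buffers. The MiB/s column is purely derived: transfers/s × transfer size ÷ 2^20. For the first summary below:

awk 'BEGIN { printf "%d MiB/s\n", 839936 * 4096 / (1024 * 1024) }'   # -> 3281 MiB/s

The same arithmetic reproduces the copy (551808/s -> 2155 MiB/s) and fill (969408/s -> 3786 MiB/s) summaries later in the section.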
00:06:25.918 00:06:25.918 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:25.918 ------------------------------------------------------------------------------------ 00:06:25.918 0,0 839936/s 3281 MiB/s 0 0 00:06:25.918 ==================================================================================== 00:06:25.918 Total 839936/s 3281 MiB/s 0 0' 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.918 06:48:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:25.918 06:48:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:25.918 06:48:55 -- accel/accel.sh@12 -- # build_accel_config 00:06:25.918 06:48:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:25.918 06:48:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.918 06:48:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.918 06:48:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:25.918 06:48:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:25.918 06:48:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:25.918 06:48:55 -- accel/accel.sh@42 -- # jq -r . 00:06:25.918 [2024-04-27 06:48:55.409115] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:25.918 [2024-04-27 06:48:55.409207] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608096 ] 00:06:25.918 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.918 [2024-04-27 06:48:55.479262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.918 [2024-04-27 06:48:55.514065] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.918 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.918 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.918 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.918 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.918 06:48:55 -- accel/accel.sh@21 -- # val=0x1 00:06:25.918 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.918 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.918 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.918 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.918 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=crc32c 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=32 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 
06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=software 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@23 -- # accel_module=software 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=32 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=32 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=1 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val=Yes 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:25.919 06:48:55 -- accel/accel.sh@21 -- # val= 00:06:25.919 06:48:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # IFS=: 00:06:25.919 06:48:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@21 -- # val= 00:06:26.857 06:48:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # IFS=: 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@21 -- # val= 00:06:26.857 06:48:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # IFS=: 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@21 -- # val= 00:06:26.857 06:48:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # IFS=: 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@21 -- # val= 00:06:26.857 06:48:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # IFS=: 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@21 -- # val= 00:06:26.857 06:48:56 -- accel/accel.sh@22 -- # case "$var" in 
00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # IFS=: 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@21 -- # val= 00:06:26.857 06:48:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # IFS=: 00:06:26.857 06:48:56 -- accel/accel.sh@20 -- # read -r var val 00:06:26.857 06:48:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:26.857 06:48:56 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:26.857 06:48:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:26.857 00:06:26.857 real 0m2.559s 00:06:26.857 user 0m2.315s 00:06:26.857 sys 0m0.253s 00:06:26.857 06:48:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.857 06:48:56 -- common/autotest_common.sh@10 -- # set +x 00:06:26.857 ************************************ 00:06:26.857 END TEST accel_crc32c 00:06:26.857 ************************************ 00:06:26.857 06:48:56 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:26.857 06:48:56 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:26.857 06:48:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.857 06:48:56 -- common/autotest_common.sh@10 -- # set +x 00:06:26.857 ************************************ 00:06:26.857 START TEST accel_crc32c_C2 00:06:26.857 ************************************ 00:06:26.857 06:48:56 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:26.857 06:48:56 -- accel/accel.sh@16 -- # local accel_opc 00:06:26.857 06:48:56 -- accel/accel.sh@17 -- # local accel_module 00:06:26.857 06:48:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:26.857 06:48:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.857 06:48:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:26.857 06:48:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.857 06:48:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.857 06:48:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.857 06:48:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.857 06:48:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.857 06:48:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.857 06:48:56 -- accel/accel.sh@42 -- # jq -r . 00:06:26.857 [2024-04-27 06:48:56.738833] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:26.857 [2024-04-27 06:48:56.738898] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608379 ] 00:06:27.117 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.117 [2024-04-27 06:48:56.804423] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.117 [2024-04-27 06:48:56.840052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.496 06:48:57 -- accel/accel.sh@18 -- # out=' 00:06:28.496 SPDK Configuration: 00:06:28.496 Core mask: 0x1 00:06:28.496 00:06:28.496 Accel Perf Configuration: 00:06:28.496 Workload Type: crc32c 00:06:28.496 CRC-32C seed: 0 00:06:28.496 Transfer size: 4096 bytes 00:06:28.496 Vector count 2 00:06:28.496 Module: software 00:06:28.496 Queue depth: 32 00:06:28.496 Allocate depth: 32 00:06:28.496 # threads/core: 1 00:06:28.496 Run time: 1 seconds 00:06:28.496 Verify: Yes 00:06:28.496 00:06:28.496 Running for 1 seconds... 00:06:28.496 00:06:28.496 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:28.496 ------------------------------------------------------------------------------------ 00:06:28.496 0,0 610016/s 4765 MiB/s 0 0 00:06:28.496 ==================================================================================== 00:06:28.496 Total 610016/s 2382 MiB/s 0 0' 00:06:28.496 06:48:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:28.496 06:48:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:28.496 06:48:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.496 06:48:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.496 06:48:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.496 06:48:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.496 06:48:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.496 06:48:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.496 06:48:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.496 06:48:58 -- accel/accel.sh@42 -- # jq -r . 00:06:28.496 [2024-04-27 06:48:58.019072] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:28.496 [2024-04-27 06:48:58.019168] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608645 ] 00:06:28.496 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.496 [2024-04-27 06:48:58.089325] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.496 [2024-04-27 06:48:58.123964] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=0x1 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=crc32c 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=0 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=software 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@23 -- # accel_module=software 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=32 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=32 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- 
accel/accel.sh@21 -- # val=1 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val=Yes 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.496 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.496 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.496 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.497 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.497 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:28.497 06:48:58 -- accel/accel.sh@21 -- # val= 00:06:28.497 06:48:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.497 06:48:58 -- accel/accel.sh@20 -- # IFS=: 00:06:28.497 06:48:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.435 06:48:59 -- accel/accel.sh@21 -- # val= 00:06:29.436 06:48:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # IFS=: 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # read -r var val 00:06:29.436 06:48:59 -- accel/accel.sh@21 -- # val= 00:06:29.436 06:48:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # IFS=: 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # read -r var val 00:06:29.436 06:48:59 -- accel/accel.sh@21 -- # val= 00:06:29.436 06:48:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # IFS=: 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # read -r var val 00:06:29.436 06:48:59 -- accel/accel.sh@21 -- # val= 00:06:29.436 06:48:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # IFS=: 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # read -r var val 00:06:29.436 06:48:59 -- accel/accel.sh@21 -- # val= 00:06:29.436 06:48:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # IFS=: 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # read -r var val 00:06:29.436 06:48:59 -- accel/accel.sh@21 -- # val= 00:06:29.436 06:48:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # IFS=: 00:06:29.436 06:48:59 -- accel/accel.sh@20 -- # read -r var val 00:06:29.436 06:48:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:29.436 06:48:59 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:29.436 06:48:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:29.436 00:06:29.436 real 0m2.564s 00:06:29.436 user 0m2.320s 00:06:29.436 sys 0m0.253s 00:06:29.436 06:48:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.436 06:48:59 -- common/autotest_common.sh@10 -- # set +x 00:06:29.436 ************************************ 00:06:29.436 END TEST accel_crc32c_C2 00:06:29.436 ************************************ 00:06:29.436 06:48:59 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:29.436 06:48:59 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:29.436 06:48:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:29.436 06:48:59 -- common/autotest_common.sh@10 -- # set +x 00:06:29.695 ************************************ 00:06:29.695 START TEST accel_copy 
00:06:29.695 ************************************ 00:06:29.695 06:48:59 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:29.695 06:48:59 -- accel/accel.sh@16 -- # local accel_opc 00:06:29.695 06:48:59 -- accel/accel.sh@17 -- # local accel_module 00:06:29.695 06:48:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:29.695 06:48:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:29.695 06:48:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.695 06:48:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.695 06:48:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.695 06:48:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.695 06:48:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.695 06:48:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.695 06:48:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.695 06:48:59 -- accel/accel.sh@42 -- # jq -r . 00:06:29.695 [2024-04-27 06:48:59.347691] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:29.695 [2024-04-27 06:48:59.347759] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608935 ] 00:06:29.695 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.695 [2024-04-27 06:48:59.412616] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.695 [2024-04-27 06:48:59.447919] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.078 06:49:00 -- accel/accel.sh@18 -- # out=' 00:06:31.078 SPDK Configuration: 00:06:31.078 Core mask: 0x1 00:06:31.078 00:06:31.078 Accel Perf Configuration: 00:06:31.078 Workload Type: copy 00:06:31.078 Transfer size: 4096 bytes 00:06:31.078 Vector count 1 00:06:31.078 Module: software 00:06:31.078 Queue depth: 32 00:06:31.078 Allocate depth: 32 00:06:31.078 # threads/core: 1 00:06:31.078 Run time: 1 seconds 00:06:31.078 Verify: Yes 00:06:31.078 00:06:31.078 Running for 1 seconds... 00:06:31.078 00:06:31.078 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:31.078 ------------------------------------------------------------------------------------ 00:06:31.078 0,0 551808/s 2155 MiB/s 0 0 00:06:31.078 ==================================================================================== 00:06:31.078 Total 551808/s 2155 MiB/s 0 0' 00:06:31.078 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.078 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.078 06:49:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:31.079 06:49:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:31.079 06:49:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.079 06:49:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.079 06:49:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.079 06:49:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.079 06:49:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.079 06:49:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.079 06:49:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.079 06:49:00 -- accel/accel.sh@42 -- # jq -r . 00:06:31.079 [2024-04-27 06:49:00.628483] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:31.079 [2024-04-27 06:49:00.628573] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2609099 ] 00:06:31.079 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.079 [2024-04-27 06:49:00.698576] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.079 [2024-04-27 06:49:00.733487] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=0x1 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=copy 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=software 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@23 -- # accel_module=software 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=32 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=32 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=1 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val=Yes 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:31.079 06:49:00 -- accel/accel.sh@21 -- # val= 00:06:31.079 06:49:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # IFS=: 00:06:31.079 06:49:00 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@21 -- # val= 00:06:32.082 06:49:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@21 -- # val= 00:06:32.082 06:49:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@21 -- # val= 00:06:32.082 06:49:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@21 -- # val= 00:06:32.082 06:49:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@21 -- # val= 00:06:32.082 06:49:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@21 -- # val= 00:06:32.082 06:49:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.082 06:49:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.082 06:49:01 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:32.082 06:49:01 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:32.082 06:49:01 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:32.082 00:06:32.082 real 0m2.563s 00:06:32.082 user 0m2.324s 00:06:32.082 sys 0m0.247s 00:06:32.082 06:49:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.082 06:49:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.082 ************************************ 00:06:32.082 END TEST accel_copy 00:06:32.082 ************************************ 00:06:32.082 06:49:01 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:32.082 06:49:01 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:32.082 06:49:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.082 06:49:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.082 ************************************ 00:06:32.082 START TEST accel_fill 00:06:32.082 ************************************ 00:06:32.082 06:49:01 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:32.082 06:49:01 -- accel/accel.sh@16 -- # local accel_opc 
00:06:32.082 06:49:01 -- accel/accel.sh@17 -- # local accel_module 00:06:32.082 06:49:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:32.083 06:49:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:32.083 06:49:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.083 06:49:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.083 06:49:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.083 06:49:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.083 06:49:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.083 06:49:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.083 06:49:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.083 06:49:01 -- accel/accel.sh@42 -- # jq -r . 00:06:32.083 [2024-04-27 06:49:01.970258] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:32.083 [2024-04-27 06:49:01.970355] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2609283 ] 00:06:32.342 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.342 [2024-04-27 06:49:02.041976] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.342 [2024-04-27 06:49:02.078378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.731 06:49:03 -- accel/accel.sh@18 -- # out=' 00:06:33.731 SPDK Configuration: 00:06:33.731 Core mask: 0x1 00:06:33.731 00:06:33.731 Accel Perf Configuration: 00:06:33.731 Workload Type: fill 00:06:33.731 Fill pattern: 0x80 00:06:33.731 Transfer size: 4096 bytes 00:06:33.731 Vector count 1 00:06:33.731 Module: software 00:06:33.731 Queue depth: 64 00:06:33.731 Allocate depth: 64 00:06:33.731 # threads/core: 1 00:06:33.731 Run time: 1 seconds 00:06:33.731 Verify: Yes 00:06:33.731 00:06:33.731 Running for 1 seconds... 00:06:33.731 00:06:33.731 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:33.731 ------------------------------------------------------------------------------------ 00:06:33.731 0,0 969408/s 3786 MiB/s 0 0 00:06:33.731 ==================================================================================== 00:06:33.731 Total 969408/s 3786 MiB/s 0 0' 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:33.731 06:49:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:33.731 06:49:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.731 06:49:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.731 06:49:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.731 06:49:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.731 06:49:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.731 06:49:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.731 06:49:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.731 06:49:03 -- accel/accel.sh@42 -- # jq -r . 00:06:33.731 [2024-04-27 06:49:03.256925] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:33.731 [2024-04-27 06:49:03.257015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2609510 ] 00:06:33.731 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.731 [2024-04-27 06:49:03.326263] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.731 [2024-04-27 06:49:03.361126] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=0x1 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=fill 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=0x80 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=software 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=64 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=64 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- 
accel/accel.sh@21 -- # val=1 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val=Yes 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:33.731 06:49:03 -- accel/accel.sh@21 -- # val= 00:06:33.731 06:49:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # IFS=: 00:06:33.731 06:49:03 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@21 -- # val= 00:06:34.668 06:49:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@21 -- # val= 00:06:34.668 06:49:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@21 -- # val= 00:06:34.668 06:49:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@21 -- # val= 00:06:34.668 06:49:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@21 -- # val= 00:06:34.668 06:49:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@21 -- # val= 00:06:34.668 06:49:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.668 06:49:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.668 06:49:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:34.668 06:49:04 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:34.668 06:49:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:34.668 00:06:34.668 real 0m2.578s 00:06:34.668 user 0m2.320s 00:06:34.668 sys 0m0.266s 00:06:34.668 06:49:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.668 06:49:04 -- common/autotest_common.sh@10 -- # set +x 00:06:34.669 ************************************ 00:06:34.669 END TEST accel_fill 00:06:34.669 ************************************ 00:06:34.928 06:49:04 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:34.928 06:49:04 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:34.928 06:49:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:34.928 06:49:04 -- common/autotest_common.sh@10 -- # set +x 00:06:34.928 ************************************ 00:06:34.928 START TEST 
accel_copy_crc32c 00:06:34.928 ************************************ 00:06:34.928 06:49:04 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:34.928 06:49:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:34.928 06:49:04 -- accel/accel.sh@17 -- # local accel_module 00:06:34.928 06:49:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:34.928 06:49:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:34.928 06:49:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.928 06:49:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.928 06:49:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.928 06:49:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.928 06:49:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.928 06:49:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.928 06:49:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.928 06:49:04 -- accel/accel.sh@42 -- # jq -r . 00:06:34.928 [2024-04-27 06:49:04.591080] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:34.928 [2024-04-27 06:49:04.591171] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2609801 ] 00:06:34.928 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.928 [2024-04-27 06:49:04.660374] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.928 [2024-04-27 06:49:04.695667] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.309 06:49:05 -- accel/accel.sh@18 -- # out=' 00:06:36.309 SPDK Configuration: 00:06:36.309 Core mask: 0x1 00:06:36.309 00:06:36.309 Accel Perf Configuration: 00:06:36.309 Workload Type: copy_crc32c 00:06:36.309 CRC-32C seed: 0 00:06:36.309 Vector size: 4096 bytes 00:06:36.309 Transfer size: 4096 bytes 00:06:36.309 Vector count 1 00:06:36.309 Module: software 00:06:36.309 Queue depth: 32 00:06:36.309 Allocate depth: 32 00:06:36.309 # threads/core: 1 00:06:36.309 Run time: 1 seconds 00:06:36.309 Verify: Yes 00:06:36.309 00:06:36.309 Running for 1 seconds... 00:06:36.309 00:06:36.309 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:36.309 ------------------------------------------------------------------------------------ 00:06:36.309 0,0 422336/s 1649 MiB/s 0 0 00:06:36.309 ==================================================================================== 00:06:36.309 Total 422336/s 1649 MiB/s 0 0' 00:06:36.309 06:49:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.309 06:49:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.309 06:49:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:36.309 06:49:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:36.309 06:49:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.309 06:49:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.309 06:49:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.309 06:49:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.309 06:49:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.309 06:49:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.309 06:49:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.309 06:49:05 -- accel/accel.sh@42 -- # jq -r . 
00:06:36.309 [2024-04-27 06:49:05.874068] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:36.309 [2024-04-27 06:49:05.874159] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2610069 ] 00:06:36.309 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.310 [2024-04-27 06:49:05.943328] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.310 [2024-04-27 06:49:05.977840] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=0x1 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=0 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=software 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=32 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 
00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=32 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=1 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val=Yes 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:36.310 06:49:06 -- accel/accel.sh@21 -- # val= 00:06:36.310 06:49:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # IFS=: 00:06:36.310 06:49:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@21 -- # val= 00:06:37.248 06:49:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # IFS=: 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@21 -- # val= 00:06:37.248 06:49:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # IFS=: 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@21 -- # val= 00:06:37.248 06:49:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # IFS=: 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@21 -- # val= 00:06:37.248 06:49:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # IFS=: 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@21 -- # val= 00:06:37.248 06:49:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # IFS=: 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@21 -- # val= 00:06:37.248 06:49:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # IFS=: 00:06:37.248 06:49:07 -- accel/accel.sh@20 -- # read -r var val 00:06:37.248 06:49:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:37.248 06:49:07 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:37.248 06:49:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.248 00:06:37.248 real 0m2.572s 00:06:37.248 user 0m2.326s 00:06:37.248 sys 0m0.256s 00:06:37.248 06:49:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.248 06:49:07 -- common/autotest_common.sh@10 -- # set +x 00:06:37.248 ************************************ 00:06:37.248 END TEST accel_copy_crc32c 00:06:37.248 ************************************ 00:06:37.508 
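
Note on the workload just verified: copy_crc32c copies a source buffer and computes a CRC-32C checksum over the copied data; the run above used "CRC-32C seed: 0" with 4096-byte buffers, and the accel_copy_crc32c_C2 run that follows repeats it with a two-element source vector (-C 2, 8192 bytes per transfer). A minimal C sketch of what the software module computes, for readers following the trace. This is an illustration, not SPDK source; crc32c_sw and copy_crc32c_sw are hypothetical names, and the bitwise loop uses the reflected Castagnoli polynomial 0x82F63B78.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Bitwise CRC-32C (Castagnoli), reflected polynomial 0x82F63B78,
     * init and final XOR of 0xFFFFFFFF folded into the ~seed / ~crc. */
    static uint32_t crc32c_sw(uint32_t seed, const void *buf, size_t len)
    {
        const uint8_t *p = buf;
        uint32_t crc = ~seed;

        while (len--) {
            crc ^= *p++;
            for (int k = 0; k < 8; k++)
                crc = (crc >> 1) ^ (0x82F63B78 & (0 - (crc & 1)));
        }
        return ~crc;
    }

    /* Per 4096-byte transfer: copy src into dst, then checksum the
     * copied data; -y ("Verify: Yes" above) re-checks the result. */
    static uint32_t copy_crc32c_sw(void *dst, const void *src,
                                   size_t len, uint32_t seed)
    {
        memcpy(dst, src, len);
        return crc32c_sw(seed, dst, len);
    }

Because the inversion in crc32c_sw undoes the previous final XOR, the CRC of a multi-element vector can be carried across elements by feeding each element's result in as the seed for the next, which is presumably how the -C 2 case below checksums its two 4096-byte elements as one 8192-byte transfer.
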
06:49:07 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:37.508 06:49:07 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:37.508 06:49:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.508 06:49:07 -- common/autotest_common.sh@10 -- # set +x 00:06:37.508 ************************************ 00:06:37.508 START TEST accel_copy_crc32c_C2 00:06:37.508 ************************************ 00:06:37.508 06:49:07 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:37.508 06:49:07 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.508 06:49:07 -- accel/accel.sh@17 -- # local accel_module 00:06:37.508 06:49:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:37.508 06:49:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.508 06:49:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:37.508 06:49:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.508 06:49:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.508 06:49:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.508 06:49:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.508 06:49:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.508 06:49:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.508 06:49:07 -- accel/accel.sh@42 -- # jq -r . 00:06:37.508 [2024-04-27 06:49:07.210985] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:37.508 [2024-04-27 06:49:07.211084] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2610350 ] 00:06:37.508 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.508 [2024-04-27 06:49:07.279876] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.508 [2024-04-27 06:49:07.315538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.887 06:49:08 -- accel/accel.sh@18 -- # out=' 00:06:38.887 SPDK Configuration: 00:06:38.887 Core mask: 0x1 00:06:38.887 00:06:38.887 Accel Perf Configuration: 00:06:38.887 Workload Type: copy_crc32c 00:06:38.887 CRC-32C seed: 0 00:06:38.887 Vector size: 4096 bytes 00:06:38.887 Transfer size: 8192 bytes 00:06:38.887 Vector count 2 00:06:38.887 Module: software 00:06:38.887 Queue depth: 32 00:06:38.887 Allocate depth: 32 00:06:38.887 # threads/core: 1 00:06:38.887 Run time: 1 seconds 00:06:38.887 Verify: Yes 00:06:38.887 00:06:38.887 Running for 1 seconds... 
00:06:38.887 00:06:38.887 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:38.887 ------------------------------------------------------------------------------------ 00:06:38.887 0,0 298016/s 2328 MiB/s 0 0 00:06:38.887 ==================================================================================== 00:06:38.887 Total 298016/s 2328 MiB/s 0 0' 00:06:38.887 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.887 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.887 06:49:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:38.887 06:49:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:38.887 06:49:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.887 06:49:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.887 06:49:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.887 06:49:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.887 06:49:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.887 06:49:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.887 06:49:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.887 06:49:08 -- accel/accel.sh@42 -- # jq -r . 00:06:38.887 [2024-04-27 06:49:08.494935] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:38.887 [2024-04-27 06:49:08.495041] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2610586 ] 00:06:38.887 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.887 [2024-04-27 06:49:08.564567] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.887 [2024-04-27 06:49:08.600317] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.887 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.887 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.887 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=0x1 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=0 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- #
IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=software 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=32 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=32 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=1 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val=Yes 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.888 06:49:08 -- accel/accel.sh@21 -- # val= 00:06:38.888 06:49:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.888 06:49:08 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@21 -- # val= 00:06:40.267 06:49:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@21 -- # val= 00:06:40.267 06:49:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@21 -- # val= 00:06:40.267 06:49:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@21 -- # val= 00:06:40.267 06:49:09 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@21 -- # val= 00:06:40.267 06:49:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@21 -- # val= 00:06:40.267 06:49:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.267 06:49:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.267 06:49:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.267 06:49:09 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:40.267 06:49:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.267 00:06:40.267 real 0m2.574s 00:06:40.267 user 0m2.324s 00:06:40.267 sys 0m0.261s 00:06:40.267 06:49:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.267 06:49:09 -- common/autotest_common.sh@10 -- # set +x 00:06:40.267 ************************************ 00:06:40.267 END TEST accel_copy_crc32c_C2 00:06:40.267 ************************************ 00:06:40.267 06:49:09 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:40.267 06:49:09 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:40.267 06:49:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:40.267 06:49:09 -- common/autotest_common.sh@10 -- # set +x 00:06:40.267 ************************************ 00:06:40.267 START TEST accel_dualcast 00:06:40.267 ************************************ 00:06:40.267 06:49:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:40.267 06:49:09 -- accel/accel.sh@16 -- # local accel_opc 00:06:40.267 06:49:09 -- accel/accel.sh@17 -- # local accel_module 00:06:40.267 06:49:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:40.267 06:49:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:40.267 06:49:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.267 06:49:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.267 06:49:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.267 06:49:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.267 06:49:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.267 06:49:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.267 06:49:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.267 06:49:09 -- accel/accel.sh@42 -- # jq -r . 00:06:40.267 [2024-04-27 06:49:09.834391] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:40.267 [2024-04-27 06:49:09.834490] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2610786 ] 00:06:40.267 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.267 [2024-04-27 06:49:09.904893] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.268 [2024-04-27 06:49:09.940350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.206 06:49:11 -- accel/accel.sh@18 -- # out=' 00:06:41.206 SPDK Configuration: 00:06:41.206 Core mask: 0x1 00:06:41.206 00:06:41.206 Accel Perf Configuration: 00:06:41.206 Workload Type: dualcast 00:06:41.206 Transfer size: 4096 bytes 00:06:41.206 Vector count 1 00:06:41.206 Module: software 00:06:41.206 Queue depth: 32 00:06:41.206 Allocate depth: 32 00:06:41.206 # threads/core: 1 00:06:41.206 Run time: 1 seconds 00:06:41.206 Verify: Yes 00:06:41.206 00:06:41.206 Running for 1 seconds... 00:06:41.206 00:06:41.206 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.206 ------------------------------------------------------------------------------------ 00:06:41.206 0,0 619808/s 2421 MiB/s 0 0 00:06:41.206 ==================================================================================== 00:06:41.206 Total 619808/s 2421 MiB/s 0 0' 00:06:41.206 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.206 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.206 06:49:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:41.206 06:49:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:41.206 06:49:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.206 06:49:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.465 06:49:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.465 06:49:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.465 06:49:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.465 06:49:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.465 06:49:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.465 06:49:11 -- accel/accel.sh@42 -- # jq -r . 00:06:41.465 [2024-04-27 06:49:11.117925] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:41.465 [2024-04-27 06:49:11.118015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2610937 ] 00:06:41.465 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.465 [2024-04-27 06:49:11.189024] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.465 [2024-04-27 06:49:11.223900] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=0x1 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=dualcast 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=software 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=32 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=32 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=1 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val=Yes 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.465 06:49:11 -- accel/accel.sh@21 -- # val= 00:06:41.465 06:49:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.465 06:49:11 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@21 -- # val= 00:06:42.842 06:49:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@21 -- # val= 00:06:42.842 06:49:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@21 -- # val= 00:06:42.842 06:49:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@21 -- # val= 00:06:42.842 06:49:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@21 -- # val= 00:06:42.842 06:49:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@21 -- # val= 00:06:42.842 06:49:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.842 06:49:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.842 06:49:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:42.842 06:49:12 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:42.842 06:49:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.842 00:06:42.842 real 0m2.572s 00:06:42.842 user 0m2.321s 00:06:42.842 sys 0m0.259s 00:06:42.842 06:49:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.842 06:49:12 -- common/autotest_common.sh@10 -- # set +x 00:06:42.843 ************************************ 00:06:42.843 END TEST accel_dualcast 00:06:42.843 ************************************ 00:06:42.843 06:49:12 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:42.843 06:49:12 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:42.843 06:49:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:42.843 06:49:12 -- common/autotest_common.sh@10 -- # set +x 00:06:42.843 ************************************ 00:06:42.843 START TEST accel_compare 00:06:42.843 ************************************ 00:06:42.843 06:49:12 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:42.843 06:49:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:42.843 06:49:12 
-- accel/accel.sh@17 -- # local accel_module 00:06:42.843 06:49:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:42.843 06:49:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:42.843 06:49:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.843 06:49:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.843 06:49:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.843 06:49:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.843 06:49:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.843 06:49:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.843 06:49:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.843 06:49:12 -- accel/accel.sh@42 -- # jq -r . 00:06:42.843 [2024-04-27 06:49:12.454133] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:42.843 [2024-04-27 06:49:12.454229] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2611213 ] 00:06:42.843 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.843 [2024-04-27 06:49:12.523497] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.843 [2024-04-27 06:49:12.558996] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.220 06:49:13 -- accel/accel.sh@18 -- # out=' 00:06:44.220 SPDK Configuration: 00:06:44.220 Core mask: 0x1 00:06:44.220 00:06:44.220 Accel Perf Configuration: 00:06:44.220 Workload Type: compare 00:06:44.220 Transfer size: 4096 bytes 00:06:44.220 Vector count 1 00:06:44.220 Module: software 00:06:44.220 Queue depth: 32 00:06:44.220 Allocate depth: 32 00:06:44.220 # threads/core: 1 00:06:44.220 Run time: 1 seconds 00:06:44.220 Verify: Yes 00:06:44.220 00:06:44.220 Running for 1 seconds... 00:06:44.220 00:06:44.220 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.220 ------------------------------------------------------------------------------------ 00:06:44.220 0,0 827552/s 3232 MiB/s 0 0 00:06:44.220 ==================================================================================== 00:06:44.220 Total 827552/s 3232 MiB/s 0 0' 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:44.220 06:49:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:44.220 06:49:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.220 06:49:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.220 06:49:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.220 06:49:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.220 06:49:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.220 06:49:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.220 06:49:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.220 06:49:13 -- accel/accel.sh@42 -- # jq -r . 00:06:44.220 [2024-04-27 06:49:13.736676] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:44.220 [2024-04-27 06:49:13.736768] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2611487 ] 00:06:44.220 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.220 [2024-04-27 06:49:13.805834] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.220 [2024-04-27 06:49:13.840145] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=0x1 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=compare 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=software 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=32 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=32 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=1 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.220 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.220 06:49:13 -- accel/accel.sh@21 -- # val=Yes 00:06:44.220 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.221 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.221 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.221 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.221 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.221 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.221 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.221 06:49:13 -- accel/accel.sh@21 -- # val= 00:06:44.221 06:49:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.221 06:49:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.221 06:49:13 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:14 -- accel/accel.sh@21 -- # val= 00:06:45.157 06:49:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:14 -- accel/accel.sh@21 -- # val= 00:06:45.157 06:49:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:14 -- accel/accel.sh@21 -- # val= 00:06:45.157 06:49:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:14 -- accel/accel.sh@21 -- # val= 00:06:45.157 06:49:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:14 -- accel/accel.sh@21 -- # val= 00:06:45.157 06:49:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.157 06:49:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.157 06:49:15 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:15 -- accel/accel.sh@21 -- # val= 00:06:45.157 06:49:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.157 06:49:15 -- accel/accel.sh@20 -- # IFS=: 00:06:45.157 06:49:15 -- accel/accel.sh@20 -- # read -r var val 00:06:45.157 06:49:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.157 06:49:15 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:45.157 06:49:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.157 00:06:45.157 real 0m2.570s 00:06:45.157 user 0m2.312s 00:06:45.157 sys 0m0.266s 00:06:45.157 06:49:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.157 06:49:15 -- common/autotest_common.sh@10 -- # set +x 00:06:45.157 ************************************ 00:06:45.157 END TEST accel_compare 00:06:45.157 ************************************ 00:06:45.157 06:49:15 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:45.157 06:49:15 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:45.157 06:49:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.157 06:49:15 -- common/autotest_common.sh@10 -- # set +x 00:06:45.157 ************************************ 00:06:45.157 START TEST accel_xor 00:06:45.157 ************************************ 00:06:45.157 06:49:15 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:45.157 06:49:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.157 06:49:15 -- accel/accel.sh@17 
-- # local accel_module 00:06:45.416 06:49:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:45.416 06:49:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:45.416 06:49:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.416 06:49:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.416 06:49:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.416 06:49:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.416 06:49:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.416 06:49:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.416 06:49:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.416 06:49:15 -- accel/accel.sh@42 -- # jq -r . 00:06:45.416 [2024-04-27 06:49:15.071689] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:45.416 [2024-04-27 06:49:15.071785] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2611768 ] 00:06:45.416 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.416 [2024-04-27 06:49:15.140840] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.416 [2024-04-27 06:49:15.176106] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.796 06:49:16 -- accel/accel.sh@18 -- # out=' 00:06:46.796 SPDK Configuration: 00:06:46.796 Core mask: 0x1 00:06:46.796 00:06:46.796 Accel Perf Configuration: 00:06:46.796 Workload Type: xor 00:06:46.796 Source buffers: 2 00:06:46.796 Transfer size: 4096 bytes 00:06:46.796 Vector count 1 00:06:46.796 Module: software 00:06:46.796 Queue depth: 32 00:06:46.796 Allocate depth: 32 00:06:46.796 # threads/core: 1 00:06:46.796 Run time: 1 seconds 00:06:46.796 Verify: Yes 00:06:46.796 00:06:46.796 Running for 1 seconds... 00:06:46.796 00:06:46.796 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:46.796 ------------------------------------------------------------------------------------ 00:06:46.796 0,0 680928/s 2659 MiB/s 0 0 00:06:46.796 ==================================================================================== 00:06:46.796 Total 680928/s 2659 MiB/s 0 0' 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:46.796 06:49:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:46.796 06:49:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.796 06:49:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.796 06:49:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.796 06:49:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.796 06:49:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.796 06:49:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.796 06:49:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.796 06:49:16 -- accel/accel.sh@42 -- # jq -r . 00:06:46.796 [2024-04-27 06:49:16.355222] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:46.796 [2024-04-27 06:49:16.355315] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2612034 ] 00:06:46.796 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.796 [2024-04-27 06:49:16.424556] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.796 [2024-04-27 06:49:16.459326] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val=0x1 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val=xor 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val=2 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.796 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.796 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.796 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val=software 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val=32 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val=32 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- 
accel/accel.sh@21 -- # val=1 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val=Yes 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.797 06:49:16 -- accel/accel.sh@21 -- # val= 00:06:46.797 06:49:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # IFS=: 00:06:46.797 06:49:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.734 06:49:17 -- accel/accel.sh@21 -- # val= 00:06:47.734 06:49:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.734 06:49:17 -- accel/accel.sh@20 -- # IFS=: 00:06:47.734 06:49:17 -- accel/accel.sh@20 -- # read -r var val 00:06:47.734 06:49:17 -- accel/accel.sh@21 -- # val= 00:06:47.734 06:49:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.734 06:49:17 -- accel/accel.sh@20 -- # IFS=: 00:06:47.734 06:49:17 -- accel/accel.sh@20 -- # read -r var val 00:06:47.734 06:49:17 -- accel/accel.sh@21 -- # val= 00:06:47.734 06:49:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.734 06:49:17 -- accel/accel.sh@20 -- # IFS=: 00:06:47.734 06:49:17 -- accel/accel.sh@20 -- # read -r var val 00:06:47.734 06:49:17 -- accel/accel.sh@21 -- # val= 00:06:47.734 06:49:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.735 06:49:17 -- accel/accel.sh@20 -- # IFS=: 00:06:47.735 06:49:17 -- accel/accel.sh@20 -- # read -r var val 00:06:47.735 06:49:17 -- accel/accel.sh@21 -- # val= 00:06:47.735 06:49:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.735 06:49:17 -- accel/accel.sh@20 -- # IFS=: 00:06:47.735 06:49:17 -- accel/accel.sh@20 -- # read -r var val 00:06:47.735 06:49:17 -- accel/accel.sh@21 -- # val= 00:06:47.735 06:49:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.735 06:49:17 -- accel/accel.sh@20 -- # IFS=: 00:06:47.735 06:49:17 -- accel/accel.sh@20 -- # read -r var val 00:06:47.735 06:49:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:47.735 06:49:17 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:47.735 06:49:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.735 00:06:47.735 real 0m2.574s 00:06:47.735 user 0m2.310s 00:06:47.735 sys 0m0.272s 00:06:47.735 06:49:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.735 06:49:17 -- common/autotest_common.sh@10 -- # set +x 00:06:47.735 ************************************ 00:06:47.735 END TEST accel_xor 00:06:47.735 ************************************ 00:06:47.994 06:49:17 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:47.994 06:49:17 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:47.994 06:49:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:47.994 06:49:17 -- common/autotest_common.sh@10 -- # set +x 00:06:47.994 ************************************ 00:06:47.994 START TEST accel_xor 
00:06:47.994 ************************************ 00:06:47.994 06:49:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:47.994 06:49:17 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.994 06:49:17 -- accel/accel.sh@17 -- # local accel_module 00:06:47.994 06:49:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:47.994 06:49:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:47.994 06:49:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.994 06:49:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.994 06:49:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.994 06:49:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.994 06:49:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.994 06:49:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.994 06:49:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.994 06:49:17 -- accel/accel.sh@42 -- # jq -r . 00:06:47.994 [2024-04-27 06:49:17.696160] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:47.994 [2024-04-27 06:49:17.696251] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2612236 ] 00:06:47.994 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.994 [2024-04-27 06:49:17.768060] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.994 [2024-04-27 06:49:17.803793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.374 06:49:18 -- accel/accel.sh@18 -- # out=' 00:06:49.374 SPDK Configuration: 00:06:49.374 Core mask: 0x1 00:06:49.374 00:06:49.374 Accel Perf Configuration: 00:06:49.374 Workload Type: xor 00:06:49.374 Source buffers: 3 00:06:49.374 Transfer size: 4096 bytes 00:06:49.374 Vector count 1 00:06:49.374 Module: software 00:06:49.374 Queue depth: 32 00:06:49.374 Allocate depth: 32 00:06:49.374 # threads/core: 1 00:06:49.374 Run time: 1 seconds 00:06:49.374 Verify: Yes 00:06:49.374 00:06:49.374 Running for 1 seconds... 00:06:49.374 00:06:49.374 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.374 ------------------------------------------------------------------------------------ 00:06:49.374 0,0 648896/s 2534 MiB/s 0 0 00:06:49.374 ==================================================================================== 00:06:49.374 Total 648896/s 2534 MiB/s 0 0' 00:06:49.374 06:49:18 -- accel/accel.sh@20 -- # IFS=: 00:06:49.374 06:49:18 -- accel/accel.sh@20 -- # read -r var val 00:06:49.374 06:49:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:49.374 06:49:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:49.374 06:49:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.374 06:49:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.374 06:49:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.374 06:49:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.374 06:49:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.374 06:49:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.374 06:49:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.374 06:49:18 -- accel/accel.sh@42 -- # jq -r . 00:06:49.374 [2024-04-27 06:49:18.983610] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
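[Editor's note] A quick consistency check on the Bandwidth column: accel_perf reports transfers/s multiplied by the 4096-byte transfer size, truncated to MiB/s. For the 3-source xor run above, 648896 x 4096 = 2,657,878,016 B/s, and 2,657,878,016 / 1,048,576 = 2534 MiB/s, matching both the per-core row and the Total row. The same arithmetic, as a one-off C check (values taken from the run above):

    /* Sanity-check accel_perf's Bandwidth column: MiB/s = transfers/s * 4096 / 2^20. */
    #include <stdio.h>

    int main(void)
    {
        long long transfers = 648896;                 /* from the xor -x 3 run above */
        long long bytes_per_sec = transfers * 4096;
        printf("%lld MiB/s\n", bytes_per_sec / (1024 * 1024));  /* prints 2534 */
        return 0;
    }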
00:06:49.374 [2024-04-27 06:49:18.983702] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2612382 ] 00:06:49.374 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.374 [2024-04-27 06:49:19.052816] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.374 [2024-04-27 06:49:19.087681] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.374 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.374 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.374 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.374 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.374 06:49:19 -- accel/accel.sh@21 -- # val=0x1 00:06:49.374 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.374 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.374 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.374 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.374 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val=xor 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val=3 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val=software 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val=32 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val=32 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- 
accel/accel.sh@21 -- # val=1 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val=Yes 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.375 06:49:19 -- accel/accel.sh@21 -- # val= 00:06:49.375 06:49:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.375 06:49:19 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@21 -- # val= 00:06:50.751 06:49:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # IFS=: 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@21 -- # val= 00:06:50.751 06:49:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # IFS=: 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@21 -- # val= 00:06:50.751 06:49:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # IFS=: 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@21 -- # val= 00:06:50.751 06:49:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # IFS=: 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@21 -- # val= 00:06:50.751 06:49:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # IFS=: 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@21 -- # val= 00:06:50.751 06:49:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # IFS=: 00:06:50.751 06:49:20 -- accel/accel.sh@20 -- # read -r var val 00:06:50.751 06:49:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:50.751 06:49:20 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:50.751 06:49:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.751 00:06:50.751 real 0m2.582s 00:06:50.751 user 0m2.328s 00:06:50.751 sys 0m0.261s 00:06:50.751 06:49:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.751 06:49:20 -- common/autotest_common.sh@10 -- # set +x 00:06:50.751 ************************************ 00:06:50.751 END TEST accel_xor 00:06:50.751 ************************************ 00:06:50.751 06:49:20 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:50.751 06:49:20 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:50.751 06:49:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:50.751 06:49:20 -- common/autotest_common.sh@10 -- # set +x 00:06:50.751 ************************************ 00:06:50.751 START TEST 
accel_dif_verify 00:06:50.751 ************************************ 00:06:50.751 06:49:20 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:50.751 06:49:20 -- accel/accel.sh@16 -- # local accel_opc 00:06:50.751 06:49:20 -- accel/accel.sh@17 -- # local accel_module 00:06:50.751 06:49:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:50.751 06:49:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:50.751 06:49:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.751 06:49:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.751 06:49:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.751 06:49:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.751 06:49:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.751 06:49:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.751 06:49:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.751 06:49:20 -- accel/accel.sh@42 -- # jq -r . 00:06:50.751 [2024-04-27 06:49:20.324530] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:50.751 [2024-04-27 06:49:20.324627] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2612631 ] 00:06:50.751 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.751 [2024-04-27 06:49:20.395317] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.751 [2024-04-27 06:49:20.430906] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.131 06:49:21 -- accel/accel.sh@18 -- # out=' 00:06:52.131 SPDK Configuration: 00:06:52.131 Core mask: 0x1 00:06:52.131 00:06:52.131 Accel Perf Configuration: 00:06:52.131 Workload Type: dif_verify 00:06:52.131 Vector size: 4096 bytes 00:06:52.131 Transfer size: 4096 bytes 00:06:52.131 Block size: 512 bytes 00:06:52.131 Metadata size: 8 bytes 00:06:52.131 Vector count 1 00:06:52.131 Module: software 00:06:52.131 Queue depth: 32 00:06:52.131 Allocate depth: 32 00:06:52.131 # threads/core: 1 00:06:52.131 Run time: 1 seconds 00:06:52.131 Verify: No 00:06:52.131 00:06:52.131 Running for 1 seconds... 00:06:52.131 00:06:52.131 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.131 ------------------------------------------------------------------------------------ 00:06:52.131 0,0 241824/s 944 MiB/s 0 0 00:06:52.131 ==================================================================================== 00:06:52.131 Total 241824/s 944 MiB/s 0 0' 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:52.131 06:49:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:52.131 06:49:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.131 06:49:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.131 06:49:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.131 06:49:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.131 06:49:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.131 06:49:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.131 06:49:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.131 06:49:21 -- accel/accel.sh@42 -- # jq -r .
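[Editor's note] The dif_verify run above (and the dif_generate run further down) operate on 4096-byte transfers that carry a T10 DIF tuple per 512-byte block, matching the "Block size: 512 bytes / Metadata size: 8 bytes" configuration: a 2-byte CRC guard, a 2-byte application tag, and a 4-byte reference tag. Below is a self-contained sketch of both directions, assuming the standard T10 CRC16 (polynomial 0x8BB7) and a guard-only check; the function names and tag choices are illustrative, not SPDK's dif library.

    #include <stddef.h>
    #include <stdint.h>

    /* CRC16 T10-DIF: poly 0x8BB7, init 0, no reflection (standard parameters). */
    static uint16_t crc_t10dif(const uint8_t *buf, size_t len)
    {
        uint16_t crc = 0;
        for (size_t i = 0; i < len; i++) {
            crc ^= (uint16_t)buf[i] << 8;
            for (int b = 0; b < 8; b++)
                crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x8BB7)
                                     : (uint16_t)(crc << 1);
        }
        return crc;
    }

    /* dif_generate: fill the 8-byte tuple for one 512-byte block
     * (big-endian fields; app tag left zero for illustration). */
    static void dif_generate_block(const uint8_t block[512], uint8_t md[8], uint32_t ref_tag)
    {
        uint16_t guard = crc_t10dif(block, 512);
        md[0] = (uint8_t)(guard >> 8);    md[1] = (uint8_t)guard;
        md[2] = 0;                        md[3] = 0;
        md[4] = (uint8_t)(ref_tag >> 24); md[5] = (uint8_t)(ref_tag >> 16);
        md[6] = (uint8_t)(ref_tag >> 8);  md[7] = (uint8_t)ref_tag;
    }

    /* dif_verify: recompute the guard and compare; returns 0 on match. */
    static int dif_verify_block(const uint8_t block[512], const uint8_t md[8])
    {
        uint16_t guard = (uint16_t)((md[0] << 8) | md[1]);
        return crc_t10dif(block, 512) == guard ? 0 : -1;
    }

    int main(void)
    {
        uint8_t block[512] = {0}, md[8];
        dif_generate_block(block, md, 1);   /* ref tag 1, illustrative */
        return dif_verify_block(block, md); /* 0: guard matches */
    }

Each 4096-byte transfer covers eight such blocks, so a CRC pass per block runs on top of the data movement, which helps explain why the DIF workloads land well below the xor figures above.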
00:06:52.131 [2024-04-27 06:49:21.609530] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:52.131 [2024-04-27 06:49:21.609622] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2612897 ] 00:06:52.131 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.131 [2024-04-27 06:49:21.678369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.131 [2024-04-27 06:49:21.712831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=0x1 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=dif_verify 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=software 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=32 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=32 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=1 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val=No 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.131 06:49:21 -- accel/accel.sh@21 -- # val= 00:06:52.131 06:49:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.131 06:49:21 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@21 -- # val= 00:06:53.069 06:49:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # IFS=: 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@21 -- # val= 00:06:53.069 06:49:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # IFS=: 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@21 -- # val= 00:06:53.069 06:49:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # IFS=: 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@21 -- # val= 00:06:53.069 06:49:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # IFS=: 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@21 -- # val= 00:06:53.069 06:49:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # IFS=: 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@21 -- # val= 00:06:53.069 06:49:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # IFS=: 00:06:53.069 06:49:22 -- accel/accel.sh@20 -- # read -r var val 00:06:53.069 06:49:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:53.069 06:49:22 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:53.069 06:49:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.069 00:06:53.069 real 0m2.573s 00:06:53.069 user 0m2.330s 00:06:53.069 sys 0m0.254s 00:06:53.069 06:49:22 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.069 06:49:22 -- common/autotest_common.sh@10 -- # set +x 00:06:53.069 ************************************ 00:06:53.069 END TEST accel_dif_verify 00:06:53.069 ************************************ 00:06:53.069 06:49:22 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:53.070 06:49:22 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:53.070 06:49:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:53.070 06:49:22 -- common/autotest_common.sh@10 -- # set +x 00:06:53.070 ************************************ 00:06:53.070 START TEST accel_dif_generate 00:06:53.070 ************************************ 00:06:53.070 06:49:22 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:06:53.070 06:49:22 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.070 06:49:22 -- accel/accel.sh@17 -- # local accel_module 00:06:53.070 06:49:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:53.070 06:49:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:53.070 06:49:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.070 06:49:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.070 06:49:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.070 06:49:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.070 06:49:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.070 06:49:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.070 06:49:22 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.070 06:49:22 -- accel/accel.sh@42 -- # jq -r . 00:06:53.070 [2024-04-27 06:49:22.946904] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:53.070 [2024-04-27 06:49:22.947012] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2613186 ] 00:06:53.328 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.329 [2024-04-27 06:49:23.015684] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.329 [2024-04-27 06:49:23.051446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.708 06:49:24 -- accel/accel.sh@18 -- # out=' 00:06:54.708 SPDK Configuration: 00:06:54.708 Core mask: 0x1 00:06:54.708 00:06:54.708 Accel Perf Configuration: 00:06:54.708 Workload Type: dif_generate 00:06:54.708 Vector size: 4096 bytes 00:06:54.708 Transfer size: 4096 bytes 00:06:54.708 Block size: 512 bytes 00:06:54.708 Metadata size: 8 bytes 00:06:54.708 Vector count 1 00:06:54.708 Module: software 00:06:54.708 Queue depth: 32 00:06:54.708 Allocate depth: 32 00:06:54.708 # threads/core: 1 00:06:54.708 Run time: 1 seconds 00:06:54.708 Verify: No 00:06:54.708 00:06:54.708 Running for 1 seconds... 
00:06:54.708 00:06:54.708 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:54.708 ------------------------------------------------------------------------------------ 00:06:54.708 0,0 287904/s 1124 MiB/s 0 0 00:06:54.708 ==================================================================================== 00:06:54.708 Total 287904/s 1124 MiB/s 0 0' 00:06:54.708 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:54.709 06:49:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:54.709 06:49:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.709 06:49:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.709 06:49:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.709 06:49:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.709 06:49:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.709 06:49:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.709 06:49:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.709 06:49:24 -- accel/accel.sh@42 -- # jq -r . 00:06:54.709 [2024-04-27 06:49:24.228991] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:54.709 [2024-04-27 06:49:24.229082] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2613452 ] 00:06:54.709 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.709 [2024-04-27 06:49:24.297106] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.709 [2024-04-27 06:49:24.331382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=0x1 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=dif_generate 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=:
00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=software 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@23 -- # accel_module=software 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=32 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=32 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=1 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val=No 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.709 06:49:24 -- accel/accel.sh@21 -- # val= 00:06:54.709 06:49:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.709 06:49:24 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@21 -- # val= 00:06:55.647 06:49:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # IFS=: 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@21 -- # val= 00:06:55.647 06:49:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # IFS=: 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@21 -- # val= 00:06:55.647 06:49:25 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # IFS=: 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@21 -- # val= 00:06:55.647 06:49:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # IFS=: 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@21 -- # val= 00:06:55.647 06:49:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # IFS=: 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@21 -- # val= 00:06:55.647 06:49:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # IFS=: 00:06:55.647 06:49:25 -- accel/accel.sh@20 -- # read -r var val 00:06:55.647 06:49:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:55.647 06:49:25 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:55.647 06:49:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.647 00:06:55.647 real 0m2.571s 00:06:55.647 user 0m2.328s 00:06:55.647 sys 0m0.253s 00:06:55.647 06:49:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.647 06:49:25 -- common/autotest_common.sh@10 -- # set +x 00:06:55.647 ************************************ 00:06:55.647 END TEST accel_dif_generate 00:06:55.647 ************************************ 00:06:55.647 06:49:25 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:55.647 06:49:25 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:55.647 06:49:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:55.647 06:49:25 -- common/autotest_common.sh@10 -- # set +x 00:06:55.906 ************************************ 00:06:55.906 START TEST accel_dif_generate_copy 00:06:55.906 ************************************ 00:06:55.906 06:49:25 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:06:55.906 06:49:25 -- accel/accel.sh@16 -- # local accel_opc 00:06:55.906 06:49:25 -- accel/accel.sh@17 -- # local accel_module 00:06:55.906 06:49:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:55.906 06:49:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:55.906 06:49:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.906 06:49:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.906 06:49:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.906 06:49:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.906 06:49:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.906 06:49:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.906 06:49:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.906 06:49:25 -- accel/accel.sh@42 -- # jq -r . 00:06:55.906 [2024-04-27 06:49:25.567356] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:55.907 [2024-04-27 06:49:25.567446] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2613727 ] 00:06:55.907 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.907 [2024-04-27 06:49:25.636972] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.907 [2024-04-27 06:49:25.672206] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.285 06:49:26 -- accel/accel.sh@18 -- # out=' 00:06:57.285 SPDK Configuration: 00:06:57.285 Core mask: 0x1 00:06:57.285 00:06:57.285 Accel Perf Configuration: 00:06:57.285 Workload Type: dif_generate_copy 00:06:57.285 Vector size: 4096 bytes 00:06:57.285 Transfer size: 4096 bytes 00:06:57.285 Vector count 1 00:06:57.285 Module: software 00:06:57.285 Queue depth: 32 00:06:57.285 Allocate depth: 32 00:06:57.285 # threads/core: 1 00:06:57.285 Run time: 1 seconds 00:06:57.285 Verify: No 00:06:57.285 00:06:57.285 Running for 1 seconds... 00:06:57.285 00:06:57.285 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:57.285 ------------------------------------------------------------------------------------ 00:06:57.285 0,0 220608/s 861 MiB/s 0 0 00:06:57.285 ==================================================================================== 00:06:57.285 Total 220608/s 861 MiB/s 0 0' 00:06:57.285 06:49:26 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:26 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:57.285 06:49:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:57.285 06:49:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.285 06:49:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.285 06:49:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.285 06:49:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.285 06:49:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.285 06:49:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.285 06:49:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.285 06:49:26 -- accel/accel.sh@42 -- # jq -r . 00:06:57.285 [2024-04-27 06:49:26.853275] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
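[Editor's note] A worked comparison for the dif_generate_copy numbers above: 220608 transfers/s x 4096 B / 2^20 = 861 MiB/s, against 1124 MiB/s for plain dif_generate. The roughly 23% drop is consistent with the extra work the _copy variant adds on top of the same per-block tuple generation, namely copying the 4096-byte payload to a separate destination buffer (a plausible reading of the software path, not a profiled conclusion).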
00:06:57.285 [2024-04-27 06:49:26.853366] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2613874 ] 00:06:57.285 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.285 [2024-04-27 06:49:26.925256] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.285 [2024-04-27 06:49:26.960561] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.285 06:49:26 -- accel/accel.sh@21 -- # val= 00:06:57.285 06:49:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:26 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val= 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val=0x1 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val= 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val= 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.285 06:49:27 -- accel/accel.sh@21 -- # val= 00:06:57.285 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.285 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val=software 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@23 -- # accel_module=software 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val=32 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val=32 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r 
var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val=1 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val=No 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val= 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.286 06:49:27 -- accel/accel.sh@21 -- # val= 00:06:57.286 06:49:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # IFS=: 00:06:57.286 06:49:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.224 06:49:28 -- accel/accel.sh@21 -- # val= 00:06:58.224 06:49:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.224 06:49:28 -- accel/accel.sh@20 -- # IFS=: 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # read -r var val 00:06:58.483 06:49:28 -- accel/accel.sh@21 -- # val= 00:06:58.483 06:49:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # IFS=: 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # read -r var val 00:06:58.483 06:49:28 -- accel/accel.sh@21 -- # val= 00:06:58.483 06:49:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # IFS=: 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # read -r var val 00:06:58.483 06:49:28 -- accel/accel.sh@21 -- # val= 00:06:58.483 06:49:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # IFS=: 00:06:58.483 06:49:28 -- accel/accel.sh@20 -- # read -r var val 00:06:58.483 06:49:28 -- accel/accel.sh@21 -- # val= 00:06:58.483 06:49:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.484 06:49:28 -- accel/accel.sh@20 -- # IFS=: 00:06:58.484 06:49:28 -- accel/accel.sh@20 -- # read -r var val 00:06:58.484 06:49:28 -- accel/accel.sh@21 -- # val= 00:06:58.484 06:49:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.484 06:49:28 -- accel/accel.sh@20 -- # IFS=: 00:06:58.484 06:49:28 -- accel/accel.sh@20 -- # read -r var val 00:06:58.484 06:49:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:58.484 06:49:28 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:58.484 06:49:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.484 00:06:58.484 real 0m2.584s 00:06:58.484 user 0m2.352s 00:06:58.484 sys 0m0.241s 00:06:58.484 06:49:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.484 06:49:28 -- common/autotest_common.sh@10 -- # set +x 00:06:58.484 ************************************ 00:06:58.484 END TEST accel_dif_generate_copy 00:06:58.484 ************************************ 00:06:58.484 06:49:28 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:58.484 06:49:28 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:58.484 06:49:28 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:58.484 06:49:28 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:06:58.484 06:49:28 -- common/autotest_common.sh@10 -- # set +x 00:06:58.484 ************************************ 00:06:58.484 START TEST accel_comp 00:06:58.484 ************************************ 00:06:58.484 06:49:28 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:58.484 06:49:28 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.484 06:49:28 -- accel/accel.sh@17 -- # local accel_module 00:06:58.484 06:49:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:58.484 06:49:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:58.484 06:49:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.484 06:49:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.484 06:49:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.484 06:49:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.484 06:49:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.484 06:49:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.484 06:49:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.484 06:49:28 -- accel/accel.sh@42 -- # jq -r . 00:06:58.484 [2024-04-27 06:49:28.200261] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:58.484 [2024-04-27 06:49:28.200352] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2614066 ] 00:06:58.484 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.484 [2024-04-27 06:49:28.270402] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.484 [2024-04-27 06:49:28.306366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.864 06:49:29 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:59.864 00:06:59.864 SPDK Configuration: 00:06:59.864 Core mask: 0x1 00:06:59.864 00:06:59.864 Accel Perf Configuration: 00:06:59.864 Workload Type: compress 00:06:59.864 Transfer size: 4096 bytes 00:06:59.864 Vector count 1 00:06:59.865 Module: software 00:06:59.865 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:59.865 Queue depth: 32 00:06:59.865 Allocate depth: 32 00:06:59.865 # threads/core: 1 00:06:59.865 Run time: 1 seconds 00:06:59.865 Verify: No 00:06:59.865 00:06:59.865 Running for 1 seconds... 
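[Editor's note] The compress workload above feeds 4096-byte chunks of the spdk/test/accel/bib test file through the software compression module. As an illustration of the operation being timed, the sketch below uses zlib's one-shot API purely as a stand-in; the log does not say which library backs the software module. Build with -lz.

    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>

    int main(void)
    {
        unsigned char in[4096], out[8192];   /* out sized above compressBound(4096) */
        uLongf out_len = sizeof(out);

        memset(in, 'a', sizeof(in));         /* stand-in for one chunk of the bib file */
        if (compress(out, &out_len, in, sizeof(in)) != Z_OK)
            return 1;
        printf("4096 -> %lu bytes\n", (unsigned long)out_len);
        return 0;
    }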
00:06:59.865 00:06:59.865 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.865 ------------------------------------------------------------------------------------ 00:06:59.865 0,0 67968/s 265 MiB/s 0 0 00:06:59.865 ==================================================================================== 00:06:59.865 Total 67968/s 265 MiB/s 0 0' 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:59.865 06:49:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:59.865 06:49:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.865 06:49:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.865 06:49:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.865 06:49:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.865 06:49:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.865 06:49:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.865 06:49:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.865 06:49:29 -- accel/accel.sh@42 -- # jq -r . 00:06:59.865 [2024-04-27 06:49:29.487637] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:59.865 [2024-04-27 06:49:29.487747] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2614315 ] 00:06:59.865 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.865 [2024-04-27 06:49:29.555857] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.865 [2024-04-27 06:49:29.590234] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=0x1 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=compress 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in
06:49:29 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=software 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@23 -- # accel_module=software 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=32 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=32 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=1 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val=No 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:06:59.865 06:49:29 -- accel/accel.sh@21 -- # val= 00:06:59.865 06:49:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # IFS=: 00:06:59.865 06:49:29 -- accel/accel.sh@20 -- # read -r var val 00:07:01.245 06:49:30 -- accel/accel.sh@21 -- # val= 00:07:01.245 06:49:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.245 06:49:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.245 06:49:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.245 06:49:30 -- accel/accel.sh@21 -- # val= 00:07:01.245 06:49:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.245 06:49:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.245 06:49:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.245 06:49:30 -- accel/accel.sh@21 -- # val= 00:07:01.245 06:49:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.245 06:49:30 -- accel/accel.sh@20 -- # 
IFS=: 00:07:01.245 06:49:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.245 06:49:30 -- accel/accel.sh@21 -- # val= 00:07:01.246 06:49:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.246 06:49:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.246 06:49:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.246 06:49:30 -- accel/accel.sh@21 -- # val= 00:07:01.246 06:49:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.246 06:49:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.246 06:49:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.246 06:49:30 -- accel/accel.sh@21 -- # val= 00:07:01.246 06:49:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.246 06:49:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.246 06:49:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.246 06:49:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.246 06:49:30 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:01.246 06:49:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.246 00:07:01.246 real 0m2.578s 00:07:01.246 user 0m2.328s 00:07:01.246 sys 0m0.260s 00:07:01.246 06:49:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.246 06:49:30 -- common/autotest_common.sh@10 -- # set +x 00:07:01.246 ************************************ 00:07:01.246 END TEST accel_comp 00:07:01.246 ************************************ 00:07:01.246 06:49:30 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:01.246 06:49:30 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:01.246 06:49:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:01.246 06:49:30 -- common/autotest_common.sh@10 -- # set +x 00:07:01.246 ************************************ 00:07:01.246 START TEST accel_decomp 00:07:01.246 ************************************ 00:07:01.246 06:49:30 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:01.246 06:49:30 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.246 06:49:30 -- accel/accel.sh@17 -- # local accel_module 00:07:01.246 06:49:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:01.246 06:49:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:01.246 06:49:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.246 06:49:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.246 06:49:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.246 06:49:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.246 06:49:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.246 06:49:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.246 06:49:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.246 06:49:30 -- accel/accel.sh@42 -- # jq -r . 00:07:01.246 [2024-04-27 06:49:30.824758] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
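The exact command the harness drives for this test is visible in the trace above. A minimal standalone invocation, assuming an in-tree SPDK build at the workspace path shown and that the JSON accel config fed over /dev/fd/62 may be omitted when no module overrides are needed, would be:

  # Sketch only; flags copied from the accel_perf line in the trace above.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # -t 1: run for 1 second; -w: workload type; -l: input file (the "File Name"
  # in the configuration dump); -y: verify each result
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y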
00:07:01.246 [2024-04-27 06:49:30.824846] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2614596 ] 00:07:01.246 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.246 [2024-04-27 06:49:30.896051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.246 [2024-04-27 06:49:30.931339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.623 06:49:32 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:02.623 00:07:02.623 SPDK Configuration: 00:07:02.623 Core mask: 0x1 00:07:02.623 00:07:02.623 Accel Perf Configuration: 00:07:02.623 Workload Type: decompress 00:07:02.623 Transfer size: 4096 bytes 00:07:02.623 Vector count 1 00:07:02.623 Module: software 00:07:02.623 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:02.623 Queue depth: 32 00:07:02.623 Allocate depth: 32 00:07:02.623 # threads/core: 1 00:07:02.623 Run time: 1 seconds 00:07:02.623 Verify: Yes 00:07:02.623 00:07:02.623 Running for 1 seconds... 00:07:02.623 00:07:02.623 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:02.623 ------------------------------------------------------------------------------------ 00:07:02.623 0,0 91136/s 167 MiB/s 0 0 00:07:02.623 ==================================================================================== 00:07:02.623 Total 91136/s 356 MiB/s 0 0' 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:02.623 06:49:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:02.623 06:49:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.623 06:49:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.623 06:49:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.623 06:49:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.623 06:49:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.623 06:49:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.623 06:49:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.623 06:49:32 -- accel/accel.sh@42 -- # jq -r . 00:07:02.623 [2024-04-27 06:49:32.113601] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
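A quick way to sanity-check the results tables in this log: the Total row is transfers per second multiplied by the 4096-byte transfer size (the per-core bandwidth column in this capture is not the same transfers-times-block-size product, so the Total line is the easier one to verify):

  # 91136 transfers/s * 4096 B per transfer, in MiB/s:
  echo $(( 91136 * 4096 / 1048576 ))   # -> 356, matching "Total 91136/s 356 MiB/s"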
00:07:02.623 [2024-04-27 06:49:32.113693] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2614870 ] 00:07:02.623 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.623 [2024-04-27 06:49:32.184803] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.623 [2024-04-27 06:49:32.219375] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=0x1 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=decompress 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=software 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=32 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 
06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=32 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=1 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val=Yes 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:02.623 06:49:32 -- accel/accel.sh@21 -- # val= 00:07:02.623 06:49:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # IFS=: 00:07:02.623 06:49:32 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@21 -- # val= 00:07:03.560 06:49:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # IFS=: 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@21 -- # val= 00:07:03.560 06:49:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # IFS=: 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@21 -- # val= 00:07:03.560 06:49:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # IFS=: 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@21 -- # val= 00:07:03.560 06:49:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # IFS=: 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@21 -- # val= 00:07:03.560 06:49:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # IFS=: 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@21 -- # val= 00:07:03.560 06:49:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # IFS=: 00:07:03.560 06:49:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.560 06:49:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:03.560 06:49:33 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:03.560 06:49:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:03.560 00:07:03.560 real 0m2.585s 00:07:03.560 user 0m2.345s 00:07:03.560 sys 0m0.249s 00:07:03.560 06:49:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:03.560 06:49:33 -- common/autotest_common.sh@10 -- # set +x 00:07:03.560 ************************************ 00:07:03.560 END TEST accel_decomp 00:07:03.560 ************************************ 00:07:03.560 06:49:33 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:03.560 06:49:33 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:03.560 06:49:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:03.560 06:49:33 -- common/autotest_common.sh@10 -- # set +x 00:07:03.561 ************************************ 00:07:03.561 START TEST accel_decmop_full 00:07:03.561 ************************************ 00:07:03.561 06:49:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:03.561 06:49:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:03.561 06:49:33 -- accel/accel.sh@17 -- # local accel_module 00:07:03.561 06:49:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:03.561 06:49:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:03.561 06:49:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.561 06:49:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.561 06:49:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.561 06:49:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.561 06:49:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.561 06:49:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.561 06:49:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.561 06:49:33 -- accel/accel.sh@42 -- # jq -r . 00:07:03.561 [2024-04-27 06:49:33.456223] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:03.561 [2024-04-27 06:49:33.456329] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2615151 ] 00:07:03.820 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.820 [2024-04-27 06:49:33.525047] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.820 [2024-04-27 06:49:33.560924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.263 06:49:34 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:05.263 00:07:05.263 SPDK Configuration: 00:07:05.263 Core mask: 0x1 00:07:05.263 00:07:05.263 Accel Perf Configuration: 00:07:05.263 Workload Type: decompress 00:07:05.263 Transfer size: 111250 bytes 00:07:05.263 Vector count 1 00:07:05.263 Module: software 00:07:05.263 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.263 Queue depth: 32 00:07:05.263 Allocate depth: 32 00:07:05.263 # threads/core: 1 00:07:05.263 Run time: 1 seconds 00:07:05.263 Verify: Yes 00:07:05.263 00:07:05.263 Running for 1 seconds... 
00:07:05.263 00:07:05.263 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.263 ------------------------------------------------------------------------------------ 00:07:05.263 0,0 5888/s 243 MiB/s 0 0 00:07:05.263 ==================================================================================== 00:07:05.263 Total 5888/s 624 MiB/s 0 0' 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:05.263 06:49:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:05.263 06:49:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.263 06:49:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.263 06:49:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.263 06:49:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.263 06:49:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.263 06:49:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.263 06:49:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.263 06:49:34 -- accel/accel.sh@42 -- # jq -r . 00:07:05.263 [2024-04-27 06:49:34.753921] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:05.263 [2024-04-27 06:49:34.754012] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2615357 ] 00:07:05.263 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.263 [2024-04-27 06:49:34.825055] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.263 [2024-04-27 06:49:34.861638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=0x1 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=decompress 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=software 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@23 -- # accel_module=software 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=32 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=32 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=1 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:05.263 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.263 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.263 06:49:34 -- accel/accel.sh@21 -- # val=Yes 00:07:05.264 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.264 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.264 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.264 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.264 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.264 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.264 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.264 06:49:34 -- accel/accel.sh@21 -- # val= 00:07:05.264 06:49:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.264 06:49:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.264 06:49:34 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@21 -- # val= 00:07:06.222 06:49:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # IFS=: 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@21 -- # val= 00:07:06.222 06:49:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # IFS=: 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@21 -- # val= 00:07:06.222 06:49:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.222 06:49:36 
-- accel/accel.sh@20 -- # IFS=: 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@21 -- # val= 00:07:06.222 06:49:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # IFS=: 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@21 -- # val= 00:07:06.222 06:49:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # IFS=: 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@21 -- # val= 00:07:06.222 06:49:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # IFS=: 00:07:06.222 06:49:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.222 06:49:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:06.222 06:49:36 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:06.222 06:49:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:06.222 00:07:06.222 real 0m2.604s 00:07:06.222 user 0m2.347s 00:07:06.222 sys 0m0.267s 00:07:06.222 06:49:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.222 06:49:36 -- common/autotest_common.sh@10 -- # set +x 00:07:06.222 ************************************ 00:07:06.222 END TEST accel_decmop_full 00:07:06.222 ************************************ 00:07:06.222 06:49:36 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:06.222 06:49:36 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:06.222 06:49:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:06.222 06:49:36 -- common/autotest_common.sh@10 -- # set +x 00:07:06.222 ************************************ 00:07:06.222 START TEST accel_decomp_mcore 00:07:06.222 ************************************ 00:07:06.222 06:49:36 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:06.222 06:49:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:06.222 06:49:36 -- accel/accel.sh@17 -- # local accel_module 00:07:06.222 06:49:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:06.222 06:49:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:06.222 06:49:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.222 06:49:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.222 06:49:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.222 06:49:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.222 06:49:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.222 06:49:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.222 06:49:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.222 06:49:36 -- accel/accel.sh@42 -- # jq -r . 00:07:06.222 [2024-04-27 06:49:36.108050] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
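(The spelling accel_decmop_full above is the suite's own name for the test, kept verbatim; it is the decompress workload run with -o 0.) The mcore variant starting here fans the same decompress workload across four reactors. A standalone sketch, with the mask taken from the -m 0xf flag in the trace and the same path assumptions as the earlier sketch:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # -m 0xf pins reactors to cores 0-3; expect one "Reactor started on core N"
  # notice per bit set in the mask, as in the EAL output below.
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -m 0xf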
00:07:06.222 [2024-04-27 06:49:36.108142] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2615549 ] 00:07:06.481 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.481 [2024-04-27 06:49:36.179104] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:06.481 [2024-04-27 06:49:36.217313] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:06.481 [2024-04-27 06:49:36.217417] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:06.481 [2024-04-27 06:49:36.217493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:06.481 [2024-04-27 06:49:36.217496] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.859 06:49:37 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:07.859 00:07:07.859 SPDK Configuration: 00:07:07.859 Core mask: 0xf 00:07:07.859 00:07:07.859 Accel Perf Configuration: 00:07:07.859 Workload Type: decompress 00:07:07.859 Transfer size: 4096 bytes 00:07:07.859 Vector count 1 00:07:07.859 Module: software 00:07:07.860 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.860 Queue depth: 32 00:07:07.860 Allocate depth: 32 00:07:07.860 # threads/core: 1 00:07:07.860 Run time: 1 seconds 00:07:07.860 Verify: Yes 00:07:07.860 00:07:07.860 Running for 1 seconds... 00:07:07.860 00:07:07.860 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:07.860 ------------------------------------------------------------------------------------ 00:07:07.860 0,0 78048/s 143 MiB/s 0 0 00:07:07.860 3,0 78592/s 144 MiB/s 0 0 00:07:07.860 2,0 78144/s 143 MiB/s 0 0 00:07:07.860 1,0 78432/s 144 MiB/s 0 0 00:07:07.860 ==================================================================================== 00:07:07.860 Total 313216/s 1223 MiB/s 0 0' 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:07.860 06:49:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:07.860 06:49:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.860 06:49:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.860 06:49:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.860 06:49:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.860 06:49:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.860 06:49:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.860 06:49:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.860 06:49:37 -- accel/accel.sh@42 -- # jq -r . 00:07:07.860 [2024-04-27 06:49:37.405200] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
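The four per-core transfer counts in the table above sum to the Total row, and the aggregate bandwidth again follows from transfers times the 4096-byte block:

  # 78048 + 78592 + 78144 + 78432 = 313216 transfers/s across cores 0-3:
  echo $(( (78048 + 78592 + 78144 + 78432) * 4096 / 1048576 ))   # -> 1223 MiB/s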
00:07:07.860 [2024-04-27 06:49:37.405290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2615737 ] 00:07:07.860 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.860 [2024-04-27 06:49:37.474622] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:07.860 [2024-04-27 06:49:37.512588] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:07.860 [2024-04-27 06:49:37.512680] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:07.860 [2024-04-27 06:49:37.512741] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:07.860 [2024-04-27 06:49:37.512743] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=0xf 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=decompress 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=software 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@23 -- # accel_module=software 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=32 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=32 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=1 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val=Yes 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:07.860 06:49:37 -- accel/accel.sh@21 -- # val= 00:07:07.860 06:49:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # IFS=: 00:07:07.860 06:49:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 
06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@21 -- # val= 00:07:08.798 06:49:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # IFS=: 00:07:08.798 06:49:38 -- accel/accel.sh@20 -- # read -r var val 00:07:08.798 06:49:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:08.798 06:49:38 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:08.798 06:49:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.798 00:07:08.798 real 0m2.604s 00:07:08.798 user 0m8.978s 00:07:08.798 sys 0m0.291s 00:07:08.798 06:49:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.799 06:49:38 -- common/autotest_common.sh@10 -- # set +x 00:07:08.799 ************************************ 00:07:08.799 END TEST accel_decomp_mcore 00:07:08.799 ************************************ 00:07:09.058 06:49:38 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:09.058 06:49:38 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:09.058 06:49:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.058 06:49:38 -- common/autotest_common.sh@10 -- # set +x 00:07:09.058 ************************************ 00:07:09.058 START TEST accel_decomp_full_mcore 00:07:09.058 ************************************ 00:07:09.058 06:49:38 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:09.058 06:49:38 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.058 06:49:38 -- accel/accel.sh@17 -- # local accel_module 00:07:09.058 06:49:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:09.058 06:49:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:09.058 06:49:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.058 06:49:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.058 06:49:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.058 06:49:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.058 06:49:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.058 06:49:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.058 06:49:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.058 06:49:38 -- accel/accel.sh@42 -- # jq -r . 00:07:09.058 [2024-04-27 06:49:38.755455] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
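This run combines the two previous variations: -o 0 selects the full-size 111250-byte buffers (see "Transfer size: 111250 bytes" in the configuration dump below) and -m 0xf spreads them over four cores. A standalone sketch under the same path assumptions as before:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0 -m 0xf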
00:07:09.058 [2024-04-27 06:49:38.755553] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2616023 ] 00:07:09.058 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.058 [2024-04-27 06:49:38.823772] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:09.058 [2024-04-27 06:49:38.860915] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.058 [2024-04-27 06:49:38.861011] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:09.058 [2024-04-27 06:49:38.861085] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:09.058 [2024-04-27 06:49:38.861087] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.439 06:49:40 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:10.439 00:07:10.439 SPDK Configuration: 00:07:10.439 Core mask: 0xf 00:07:10.439 00:07:10.439 Accel Perf Configuration: 00:07:10.439 Workload Type: decompress 00:07:10.439 Transfer size: 111250 bytes 00:07:10.439 Vector count 1 00:07:10.439 Module: software 00:07:10.439 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.439 Queue depth: 32 00:07:10.439 Allocate depth: 32 00:07:10.439 # threads/core: 1 00:07:10.439 Run time: 1 seconds 00:07:10.439 Verify: Yes 00:07:10.439 00:07:10.439 Running for 1 seconds... 00:07:10.439 00:07:10.439 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:10.439 ------------------------------------------------------------------------------------ 00:07:10.439 0,0 5792/s 239 MiB/s 0 0 00:07:10.439 3,0 5824/s 240 MiB/s 0 0 00:07:10.439 2,0 5824/s 240 MiB/s 0 0 00:07:10.439 1,0 5824/s 240 MiB/s 0 0 00:07:10.439 ==================================================================================== 00:07:10.439 Total 23264/s 2468 MiB/s 0 0' 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:10.439 06:49:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:10.439 06:49:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.439 06:49:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.439 06:49:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.439 06:49:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.439 06:49:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.439 06:49:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.439 06:49:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.439 06:49:40 -- accel/accel.sh@42 -- # jq -r . 00:07:10.439 [2024-04-27 06:49:40.062445] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
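The "EAL: No free 2048 kB hugepages reported on node 1" notice recurs on every run in this log; the runs proceed, so the pages are evidently satisfied from another node. Should the notice ever escalate to an allocation failure, one way to inspect the per-node and global 2 MiB pools on a Linux host is:

  # Per-NUMA-node 2 MiB hugepage counts:
  grep -H '' /sys/devices/system/node/node*/hugepages/hugepages-2048kB/nr_hugepages
  # Global pool:
  grep Huge /proc/meminfo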
00:07:10.439 [2024-04-27 06:49:40.062535] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2616292 ] 00:07:10.439 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.439 [2024-04-27 06:49:40.144364] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:10.439 [2024-04-27 06:49:40.183695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.439 [2024-04-27 06:49:40.183797] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:10.439 [2024-04-27 06:49:40.183877] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:10.439 [2024-04-27 06:49:40.183879] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=0xf 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=decompress 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=software 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=32 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=32 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=1 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val=Yes 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:10.439 06:49:40 -- accel/accel.sh@21 -- # val= 00:07:10.439 06:49:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # IFS=: 00:07:10.439 06:49:40 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 
06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@21 -- # val= 00:07:11.822 06:49:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # IFS=: 00:07:11.822 06:49:41 -- accel/accel.sh@20 -- # read -r var val 00:07:11.822 06:49:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:11.822 06:49:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:11.822 06:49:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:11.822 00:07:11.822 real 0m2.635s 00:07:11.822 user 0m9.067s 00:07:11.822 sys 0m0.278s 00:07:11.822 06:49:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.823 06:49:41 -- common/autotest_common.sh@10 -- # set +x 00:07:11.823 ************************************ 00:07:11.823 END TEST accel_decomp_full_mcore 00:07:11.823 ************************************ 00:07:11.823 06:49:41 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:11.823 06:49:41 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:11.823 06:49:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:11.823 06:49:41 -- common/autotest_common.sh@10 -- # set +x 00:07:11.823 ************************************ 00:07:11.823 START TEST accel_decomp_mthread 00:07:11.823 ************************************ 00:07:11.823 06:49:41 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:11.823 06:49:41 -- accel/accel.sh@16 -- # local accel_opc 00:07:11.823 06:49:41 -- accel/accel.sh@17 -- # local accel_module 00:07:11.823 06:49:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:11.823 06:49:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:11.823 06:49:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.823 06:49:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.823 06:49:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.823 06:49:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.823 06:49:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.823 06:49:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.823 06:49:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.823 06:49:41 -- accel/accel.sh@42 -- # jq -r . 00:07:11.823 [2024-04-27 06:49:41.439441] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:11.823 [2024-04-27 06:49:41.439533] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2616584 ] 00:07:11.823 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.823 [2024-04-27 06:49:41.509452] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.823 [2024-04-27 06:49:41.545088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.203 06:49:42 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:13.203 00:07:13.203 SPDK Configuration: 00:07:13.203 Core mask: 0x1 00:07:13.203 00:07:13.203 Accel Perf Configuration: 00:07:13.203 Workload Type: decompress 00:07:13.203 Transfer size: 4096 bytes 00:07:13.203 Vector count 1 00:07:13.203 Module: software 00:07:13.203 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.203 Queue depth: 32 00:07:13.203 Allocate depth: 32 00:07:13.203 # threads/core: 2 00:07:13.203 Run time: 1 seconds 00:07:13.203 Verify: Yes 00:07:13.203 00:07:13.203 Running for 1 seconds... 00:07:13.203 00:07:13.203 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.203 ------------------------------------------------------------------------------------ 00:07:13.203 0,1 48032/s 88 MiB/s 0 0 00:07:13.203 0,0 47904/s 88 MiB/s 0 0 00:07:13.203 ==================================================================================== 00:07:13.203 Total 95936/s 374 MiB/s 0 0' 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.203 06:49:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.203 06:49:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.203 06:49:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.203 06:49:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.203 06:49:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.203 06:49:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.203 06:49:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.203 06:49:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.203 06:49:42 -- accel/accel.sh@42 -- # jq -r . 00:07:13.203 [2024-04-27 06:49:42.729115] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
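With -T 2 the harness runs two threads on core 0, which is why the table above reports "# threads/core: 2" and carries rows 0,0 and 0,1; the two rows again sum to the Total:

  # (48032 + 47904) transfers/s * 4096 B per transfer, in MiB/s:
  echo $(( (48032 + 47904) * 4096 / 1048576 ))   # -> 374, matching the Total row
  # Standalone sketch of the threaded variant (same path assumptions as above):
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -T 2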
00:07:13.203 [2024-04-27 06:49:42.729202] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2616852 ] 00:07:13.203 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.203 [2024-04-27 06:49:42.798885] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.203 [2024-04-27 06:49:42.833534] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=0x1 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=decompress 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=software 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=32 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 
06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=32 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=2 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val=Yes 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.203 06:49:42 -- accel/accel.sh@21 -- # val= 00:07:13.203 06:49:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.203 06:49:42 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:43 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:43 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:43 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:43 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:43 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:43 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:44 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:44 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:44 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:44 -- accel/accel.sh@21 -- # val= 00:07:14.140 06:49:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.140 06:49:44 -- accel/accel.sh@20 -- # IFS=: 00:07:14.140 06:49:44 -- accel/accel.sh@20 -- # read -r var val 00:07:14.140 06:49:44 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.140 06:49:44 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:14.140 06:49:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.140 00:07:14.140 real 0m2.585s 00:07:14.140 user 0m2.330s 00:07:14.140 sys 0m0.265s 00:07:14.140 06:49:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.140 06:49:44 -- common/autotest_common.sh@10 -- # 
set +x 00:07:14.140 ************************************ 00:07:14.140 END TEST accel_decomp_mthread 00:07:14.140 ************************************ 00:07:14.398 06:49:44 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.398 06:49:44 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:14.398 06:49:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:14.398 06:49:44 -- common/autotest_common.sh@10 -- # set +x 00:07:14.398 ************************************ 00:07:14.398 START TEST accel_deomp_full_mthread 00:07:14.398 ************************************ 00:07:14.398 06:49:44 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.398 06:49:44 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.398 06:49:44 -- accel/accel.sh@17 -- # local accel_module 00:07:14.398 06:49:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.398 06:49:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.398 06:49:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.398 06:49:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.398 06:49:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.398 06:49:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.398 06:49:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.398 06:49:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.398 06:49:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.398 06:49:44 -- accel/accel.sh@42 -- # jq -r . 00:07:14.398 [2024-04-27 06:49:44.075156] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:14.398 [2024-04-27 06:49:44.075246] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617050 ] 00:07:14.398 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.398 [2024-04-27 06:49:44.146022] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.398 [2024-04-27 06:49:44.181309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.776 06:49:45 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:15.776 00:07:15.776 SPDK Configuration: 00:07:15.776 Core mask: 0x1 00:07:15.776 00:07:15.776 Accel Perf Configuration: 00:07:15.776 Workload Type: decompress 00:07:15.776 Transfer size: 111250 bytes 00:07:15.776 Vector count 1 00:07:15.776 Module: software 00:07:15.776 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.776 Queue depth: 32 00:07:15.776 Allocate depth: 32 00:07:15.776 # threads/core: 2 00:07:15.776 Run time: 1 seconds 00:07:15.776 Verify: Yes 00:07:15.776 00:07:15.776 Running for 1 seconds... 
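The run configured above can be reproduced outside the test harness with the same accel_perf flags it traces. A minimal sketch, assuming the workspace path shown in this log; the harness additionally feeds a JSON accel config over -c /dev/fd/62, which is omitted here:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
# -t 1: one-second run; -w decompress: workload type; -l: compressed input;
# -y: verify output; -T 2: two worker threads per core; -o 0 selected the
# full-buffer transfers (111250 bytes) reported for this run.
"$SPDK/build/examples/accel_perf" -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y -o 0 -T 2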
00:07:15.776 00:07:15.776 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:15.776 ------------------------------------------------------------------------------------ 00:07:15.776 0,1 3040/s 125 MiB/s 0 0 00:07:15.776 0,0 3008/s 124 MiB/s 0 0 00:07:15.776 ==================================================================================== 00:07:15.776 Total 6048/s 641 MiB/s 0 0' 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:15.776 06:49:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:15.776 06:49:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.776 06:49:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.776 06:49:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.776 06:49:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.776 06:49:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.776 06:49:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.776 06:49:45 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.776 06:49:45 -- accel/accel.sh@42 -- # jq -r . 00:07:15.776 [2024-04-27 06:49:45.385619] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:15.776 [2024-04-27 06:49:45.385747] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617203 ] 00:07:15.776 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.776 [2024-04-27 06:49:45.457570] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.776 [2024-04-27 06:49:45.493434] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=0x1 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=decompress 
00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=software 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@23 -- # accel_module=software 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=32 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=32 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val=2 00:07:15.776 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.776 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.776 06:49:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:15.777 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.777 06:49:45 -- accel/accel.sh@21 -- # val=Yes 00:07:15.777 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.777 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.777 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:15.777 06:49:45 -- accel/accel.sh@21 -- # val= 00:07:15.777 06:49:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # IFS=: 00:07:15.777 06:49:45 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@21 -- # val= 00:07:17.154 06:49:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.154 06:49:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.154 06:49:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.154 06:49:46 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:17.154 06:49:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.154 00:07:17.154 real 0m2.629s 00:07:17.154 user 0m2.378s 00:07:17.154 sys 0m0.260s 00:07:17.154 06:49:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.154 06:49:46 -- common/autotest_common.sh@10 -- # set +x 00:07:17.154 ************************************ 00:07:17.154 END TEST accel_deomp_full_mthread 00:07:17.154 ************************************ 00:07:17.154 06:49:46 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:17.154 06:49:46 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:17.154 06:49:46 -- accel/accel.sh@129 -- # build_accel_config 00:07:17.154 06:49:46 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:17.154 06:49:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.154 06:49:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.154 06:49:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.154 06:49:46 -- common/autotest_common.sh@10 -- # set +x 00:07:17.154 06:49:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.154 06:49:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.154 06:49:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.154 06:49:46 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.154 06:49:46 -- accel/accel.sh@42 -- # jq -r . 00:07:17.154 ************************************ 00:07:17.154 START TEST accel_dif_functional_tests 00:07:17.154 ************************************ 00:07:17.154 06:49:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:17.154 [2024-04-27 06:49:46.755673] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:17.154 [2024-04-27 06:49:46.755761] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617451 ] 00:07:17.154 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.154 [2024-04-27 06:49:46.826621] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:17.154 [2024-04-27 06:49:46.863507] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.154 [2024-04-27 06:49:46.863601] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:17.154 [2024-04-27 06:49:46.863603] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.154 00:07:17.155 00:07:17.155 CUnit - A unit testing framework for C - Version 2.1-3 00:07:17.155 http://cunit.sourceforge.net/ 00:07:17.155 00:07:17.155 00:07:17.155 Suite: accel_dif 00:07:17.155 Test: verify: DIF generated, GUARD check ...passed 00:07:17.155 Test: verify: DIF generated, APPTAG check ...passed 00:07:17.155 Test: verify: DIF generated, REFTAG check ...passed 00:07:17.155 Test: verify: DIF not generated, GUARD check ...[2024-04-27 06:49:46.924683] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:17.155 [2024-04-27 06:49:46.924735] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:17.155 passed 00:07:17.155 Test: verify: DIF not generated, APPTAG check ...[2024-04-27 06:49:46.924789] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:17.155 [2024-04-27 06:49:46.924808] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:17.155 passed 00:07:17.155 Test: verify: DIF not generated, REFTAG check ...[2024-04-27 06:49:46.924828] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:17.155 [2024-04-27 06:49:46.924847] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:17.155 passed 00:07:17.155 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:17.155 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-27 06:49:46.924890] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:17.155 passed 00:07:17.155 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:17.155 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:17.155 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:17.155 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-27 06:49:46.924988] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:17.155 passed 00:07:17.155 Test: generate copy: DIF generated, GUARD check ...passed 00:07:17.155 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:17.155 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:17.155 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:17.155 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:17.155 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:17.155 Test: generate copy: iovecs-len validate ...[2024-04-27 06:49:46.925153] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:17.155 passed 00:07:17.155 Test: generate copy: buffer alignment validate ...passed 00:07:17.155 00:07:17.155 Run Summary: Type Total Ran Passed Failed Inactive 00:07:17.155 suites 1 1 n/a 0 0 00:07:17.155 tests 20 20 20 0 0 00:07:17.155 asserts 204 204 204 0 n/a 00:07:17.155 00:07:17.155 Elapsed time = 0.000 seconds 00:07:17.414 00:07:17.414 real 0m0.341s 00:07:17.414 user 0m0.521s 00:07:17.414 sys 0m0.156s 00:07:17.414 06:49:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.414 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.414 ************************************ 00:07:17.414 END TEST accel_dif_functional_tests 00:07:17.414 ************************************ 00:07:17.414 00:07:17.414 real 0m55.188s 00:07:17.414 user 1m2.757s 00:07:17.414 sys 0m7.101s 00:07:17.414 06:49:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.414 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.414 ************************************ 00:07:17.414 END TEST accel 00:07:17.414 ************************************ 00:07:17.414 06:49:47 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:17.414 06:49:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:17.414 06:49:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.414 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.414 ************************************ 00:07:17.414 START TEST accel_rpc 00:07:17.414 ************************************ 00:07:17.414 06:49:47 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:17.414 * Looking for test storage... 00:07:17.414 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:17.414 06:49:47 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:17.414 06:49:47 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2617721 00:07:17.414 06:49:47 -- accel/accel_rpc.sh@15 -- # waitforlisten 2617721 00:07:17.414 06:49:47 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:17.414 06:49:47 -- common/autotest_common.sh@819 -- # '[' -z 2617721 ']' 00:07:17.414 06:49:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:17.414 06:49:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:17.414 06:49:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:17.414 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:17.414 06:49:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:17.414 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.414 [2024-04-27 06:49:47.292846] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:17.414 [2024-04-27 06:49:47.292925] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617721 ] 00:07:17.673 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.673 [2024-04-27 06:49:47.364174] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.673 [2024-04-27 06:49:47.402347] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:17.673 [2024-04-27 06:49:47.402475] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.673 06:49:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:17.673 06:49:47 -- common/autotest_common.sh@852 -- # return 0 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:17.673 06:49:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:17.673 06:49:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.673 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.673 ************************************ 00:07:17.673 START TEST accel_assign_opcode 00:07:17.673 ************************************ 00:07:17.673 06:49:47 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:17.673 06:49:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:17.673 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.673 [2024-04-27 06:49:47.454928] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:17.673 06:49:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:17.673 06:49:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:17.673 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.673 [2024-04-27 06:49:47.462939] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:17.673 06:49:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:17.673 06:49:47 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:17.673 06:49:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:17.673 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.932 06:49:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:17.932 06:49:47 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:17.932 06:49:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:17.932 06:49:47 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:17.932 06:49:47 -- common/autotest_common.sh@10 -- # set +x 00:07:17.932 06:49:47 -- accel/accel_rpc.sh@42 -- # grep software 00:07:17.932 06:49:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:17.932 software 00:07:17.932 00:07:17.932 real 0m0.216s 00:07:17.932 user 0m0.045s 00:07:17.932 sys 0m0.013s 00:07:17.932 06:49:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.932 06:49:47 -- common/autotest_common.sh@10 -- # set +x 
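The accel_assign_opcode body traced above reduces to four RPC calls. A condensed sketch, assuming a spdk_tgt already started with --wait-for-rpc (as the harness does) and using scripts/rpc.py in place of the suite's rpc_cmd wrapper:

RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
$RPC accel_assign_opc -o copy -m incorrect    # accepted pre-init, logged only as a NOTICE
$RPC accel_assign_opc -o copy -m software     # the later assignment wins
$RPC framework_start_init                     # modules are resolved at init time
$RPC accel_get_opc_assignments | jq -r .copy  # prints "software", which the test greps for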
00:07:17.932 ************************************ 00:07:17.932 END TEST accel_assign_opcode 00:07:17.932 ************************************ 00:07:17.932 06:49:47 -- accel/accel_rpc.sh@55 -- # killprocess 2617721 00:07:17.932 06:49:47 -- common/autotest_common.sh@926 -- # '[' -z 2617721 ']' 00:07:17.932 06:49:47 -- common/autotest_common.sh@930 -- # kill -0 2617721 00:07:17.932 06:49:47 -- common/autotest_common.sh@931 -- # uname 00:07:17.932 06:49:47 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:17.932 06:49:47 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2617721 00:07:17.932 06:49:47 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:17.932 06:49:47 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:17.932 06:49:47 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2617721' 00:07:17.932 killing process with pid 2617721 00:07:17.932 06:49:47 -- common/autotest_common.sh@945 -- # kill 2617721 00:07:17.932 06:49:47 -- common/autotest_common.sh@950 -- # wait 2617721 00:07:18.191 00:07:18.191 real 0m0.882s 00:07:18.191 user 0m0.781s 00:07:18.191 sys 0m0.438s 00:07:18.191 06:49:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.191 06:49:48 -- common/autotest_common.sh@10 -- # set +x 00:07:18.191 ************************************ 00:07:18.191 END TEST accel_rpc 00:07:18.191 ************************************ 00:07:18.450 06:49:48 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:18.450 06:49:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:18.450 06:49:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.450 06:49:48 -- common/autotest_common.sh@10 -- # set +x 00:07:18.450 ************************************ 00:07:18.450 START TEST app_cmdline 00:07:18.450 ************************************ 00:07:18.450 06:49:48 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:18.450 * Looking for test storage... 00:07:18.450 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:18.450 06:49:48 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:18.450 06:49:48 -- app/cmdline.sh@17 -- # spdk_tgt_pid=2617841 00:07:18.450 06:49:48 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:18.450 06:49:48 -- app/cmdline.sh@18 -- # waitforlisten 2617841 00:07:18.450 06:49:48 -- common/autotest_common.sh@819 -- # '[' -z 2617841 ']' 00:07:18.450 06:49:48 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.450 06:49:48 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:18.450 06:49:48 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.450 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.450 06:49:48 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:18.450 06:49:48 -- common/autotest_common.sh@10 -- # set +x 00:07:18.450 [2024-04-27 06:49:48.215023] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:18.450 [2024-04-27 06:49:48.215112] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617841 ] 00:07:18.450 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.450 [2024-04-27 06:49:48.285246] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.450 [2024-04-27 06:49:48.322682] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:18.450 [2024-04-27 06:49:48.322798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.387 06:49:49 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:19.387 06:49:49 -- common/autotest_common.sh@852 -- # return 0 00:07:19.387 06:49:49 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:19.387 { 00:07:19.387 "version": "SPDK v24.01.1-pre git sha1 36faa8c31", 00:07:19.387 "fields": { 00:07:19.387 "major": 24, 00:07:19.387 "minor": 1, 00:07:19.387 "patch": 1, 00:07:19.387 "suffix": "-pre", 00:07:19.387 "commit": "36faa8c31" 00:07:19.387 } 00:07:19.387 } 00:07:19.387 06:49:49 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:19.387 06:49:49 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:19.387 06:49:49 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:19.387 06:49:49 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:19.387 06:49:49 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:19.387 06:49:49 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.387 06:49:49 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:19.387 06:49:49 -- common/autotest_common.sh@10 -- # set +x 00:07:19.387 06:49:49 -- app/cmdline.sh@26 -- # sort 00:07:19.387 06:49:49 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.387 06:49:49 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:19.387 06:49:49 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:19.387 06:49:49 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:19.387 06:49:49 -- common/autotest_common.sh@640 -- # local es=0 00:07:19.387 06:49:49 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:19.387 06:49:49 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:19.387 06:49:49 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:19.387 06:49:49 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:19.387 06:49:49 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:19.387 06:49:49 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:19.387 06:49:49 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:19.387 06:49:49 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:19.387 06:49:49 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:19.387 06:49:49 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:19.647 request: 00:07:19.647 { 00:07:19.647 "method": "env_dpdk_get_mem_stats", 00:07:19.647 "req_id": 1 00:07:19.647 } 00:07:19.647 Got JSON-RPC error response 00:07:19.647 response: 00:07:19.647 { 00:07:19.647 "code": -32601, 00:07:19.648 "message": "Method not found" 00:07:19.648 } 00:07:19.648 06:49:49 -- common/autotest_common.sh@643 -- # es=1 00:07:19.648 06:49:49 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:19.648 06:49:49 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:19.648 06:49:49 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:19.648 06:49:49 -- app/cmdline.sh@1 -- # killprocess 2617841 00:07:19.648 06:49:49 -- common/autotest_common.sh@926 -- # '[' -z 2617841 ']' 00:07:19.648 06:49:49 -- common/autotest_common.sh@930 -- # kill -0 2617841 00:07:19.648 06:49:49 -- common/autotest_common.sh@931 -- # uname 00:07:19.648 06:49:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:19.648 06:49:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2617841 00:07:19.648 06:49:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:19.648 06:49:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:19.648 06:49:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2617841' 00:07:19.648 killing process with pid 2617841 00:07:19.648 06:49:49 -- common/autotest_common.sh@945 -- # kill 2617841 00:07:19.648 06:49:49 -- common/autotest_common.sh@950 -- # wait 2617841 00:07:19.908 00:07:19.908 real 0m1.638s 00:07:19.908 user 0m1.892s 00:07:19.908 sys 0m0.486s 00:07:19.908 06:49:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.908 06:49:49 -- common/autotest_common.sh@10 -- # set +x 00:07:19.908 ************************************ 00:07:19.908 END TEST app_cmdline 00:07:19.908 ************************************ 00:07:19.908 06:49:49 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:19.908 06:49:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.908 06:49:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.908 06:49:49 -- common/autotest_common.sh@10 -- # set +x 00:07:19.908 ************************************ 00:07:19.908 START TEST version 00:07:19.908 ************************************ 00:07:19.908 06:49:49 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:20.168 * Looking for test storage... 
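The "Method not found" response above (code -32601) is the RPC allow-list at work: the harness launched the target with --rpcs-allowed, so only two methods are callable. A sketch of the same behavior, with the binary and flags as traced in this log:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
# once /var/tmp/spdk.sock is up:
"$SPDK/scripts/rpc.py" spdk_get_version        # allowed: returns the version object shown above
"$SPDK/scripts/rpc.py" env_dpdk_get_mem_stats  # filtered: "Method not found", code -32601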
00:07:20.168 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:20.168 06:49:49 -- app/version.sh@17 -- # get_header_version major 00:07:20.168 06:49:49 -- app/version.sh@14 -- # cut -f2 00:07:20.168 06:49:49 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:20.168 06:49:49 -- app/version.sh@14 -- # tr -d '"' 00:07:20.168 06:49:49 -- app/version.sh@17 -- # major=24 00:07:20.168 06:49:49 -- app/version.sh@18 -- # get_header_version minor 00:07:20.168 06:49:49 -- app/version.sh@14 -- # cut -f2 00:07:20.168 06:49:49 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:20.168 06:49:49 -- app/version.sh@14 -- # tr -d '"' 00:07:20.168 06:49:49 -- app/version.sh@18 -- # minor=1 00:07:20.168 06:49:49 -- app/version.sh@19 -- # get_header_version patch 00:07:20.168 06:49:49 -- app/version.sh@14 -- # cut -f2 00:07:20.168 06:49:49 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:20.168 06:49:49 -- app/version.sh@14 -- # tr -d '"' 00:07:20.168 06:49:49 -- app/version.sh@19 -- # patch=1 00:07:20.168 06:49:49 -- app/version.sh@20 -- # get_header_version suffix 00:07:20.168 06:49:49 -- app/version.sh@14 -- # cut -f2 00:07:20.168 06:49:49 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:20.168 06:49:49 -- app/version.sh@14 -- # tr -d '"' 00:07:20.168 06:49:49 -- app/version.sh@20 -- # suffix=-pre 00:07:20.168 06:49:49 -- app/version.sh@22 -- # version=24.1 00:07:20.168 06:49:49 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:20.168 06:49:49 -- app/version.sh@25 -- # version=24.1.1 00:07:20.168 06:49:49 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:20.168 06:49:49 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:20.168 06:49:49 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:20.168 06:49:49 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:20.168 06:49:49 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:20.168 00:07:20.168 real 0m0.178s 00:07:20.168 user 0m0.086s 00:07:20.168 sys 0m0.130s 00:07:20.168 06:49:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.168 06:49:49 -- common/autotest_common.sh@10 -- # set +x 00:07:20.168 ************************************ 00:07:20.168 END TEST version 00:07:20.168 ************************************ 00:07:20.168 06:49:49 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:49 -- spdk/autotest.sh@204 -- # uname -s 00:07:20.168 06:49:50 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:20.168 06:49:50 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:20.168 06:49:50 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:20.168 06:49:50 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@268 -- # timing_exit lib 
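version.sh, traced in full above, derives the version string purely from version.h. Its core condenses to the following sketch (header path as in this log; the mapping of the -pre suffix to the final rc0 form happens in a later step of the script):

hdr=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h
get_header_version() {   # e.g. MAJOR -> 24
    grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" "$hdr" | cut -f2 | tr -d '"'
}
major=$(get_header_version MAJOR)             # 24
minor=$(get_header_version MINOR)             # 1
patch=$(get_header_version PATCH)             # 1
version=$major.$minor
(( patch != 0 )) && version=$version.$patch   # 24.1.1
# the test passes only if the rc0-suffixed form (24.1.1rc0) equals python's
# spdk.__version__, as checked above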
00:07:20.168 06:49:50 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:20.168 06:49:50 -- common/autotest_common.sh@10 -- # set +x 00:07:20.168 06:49:50 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:20.168 06:49:50 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:20.168 06:49:50 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:20.168 06:49:50 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:20.168 06:49:50 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:20.168 06:49:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.168 06:49:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.168 06:49:50 -- common/autotest_common.sh@10 -- # set +x 00:07:20.168 ************************************ 00:07:20.168 START TEST llvm_fuzz 00:07:20.168 ************************************ 00:07:20.168 06:49:50 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:20.429 * Looking for test storage... 
00:07:20.429 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:20.429 06:49:50 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:20.429 06:49:50 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:20.429 06:49:50 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:20.429 06:49:50 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:20.429 06:49:50 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:20.429 06:49:50 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:20.429 06:49:50 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:20.429 06:49:50 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:20.429 06:49:50 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:20.429 06:49:50 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:20.429 06:49:50 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:20.429 06:49:50 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:20.429 06:49:50 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:20.429 06:49:50 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:20.429 06:49:50 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:20.429 06:49:50 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:20.429 06:49:50 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:20.429 06:49:50 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:20.429 06:49:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.429 06:49:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.429 06:49:50 -- common/autotest_common.sh@10 -- # set +x 00:07:20.429 ************************************ 00:07:20.429 START TEST nvmf_fuzz 00:07:20.429 ************************************ 00:07:20.429 06:49:50 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:20.429 * Looking for test storage... 
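get_fuzzer_targets, traced above, is a glob plus a basename strip. A sketch with the rootdir from this log:

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
fuzzers=("$rootdir/test/fuzz/llvm/"*)   # full paths under test/fuzz/llvm
fuzzers=("${fuzzers[@]##*/}")           # keep basenames only
echo "${fuzzers[@]}"                    # -> common.sh llvm-gcov.sh nvmf vfio

The case statement in llvm.sh then skips the two helper scripts and dispatches run.sh for the nvmf target, matching the run_test nvmf_fuzz line above (the vfio target presumably follows once nvmf completes).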
00:07:20.429 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:20.429 06:49:50 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:20.429 06:49:50 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:20.429 06:49:50 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:20.429 06:49:50 -- common/autotest_common.sh@34 -- # set -e 00:07:20.429 06:49:50 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:20.429 06:49:50 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:20.429 06:49:50 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:20.429 06:49:50 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:20.429 06:49:50 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:20.429 06:49:50 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:20.429 06:49:50 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:20.429 06:49:50 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:20.429 06:49:50 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:20.429 06:49:50 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:20.429 06:49:50 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:20.429 06:49:50 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:20.429 06:49:50 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:20.429 06:49:50 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:20.429 06:49:50 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:20.429 06:49:50 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:20.429 06:49:50 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:20.429 06:49:50 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:20.429 06:49:50 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:20.429 06:49:50 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:20.429 06:49:50 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:20.429 06:49:50 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:20.429 06:49:50 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:20.429 06:49:50 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:20.429 06:49:50 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:20.429 06:49:50 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:20.429 06:49:50 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:20.429 06:49:50 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:20.429 06:49:50 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:20.429 06:49:50 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:20.429 06:49:50 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:20.429 06:49:50 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:20.429 06:49:50 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:20.429 06:49:50 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:20.429 06:49:50 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:20.429 06:49:50 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:20.429 06:49:50 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:20.429 06:49:50 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:20.429 06:49:50 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:20.429 06:49:50 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:20.429 06:49:50 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:20.429 06:49:50 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:20.429 06:49:50 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:20.429 06:49:50 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:20.429 06:49:50 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:20.429 06:49:50 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:20.429 06:49:50 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:20.429 06:49:50 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:20.429 06:49:50 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:20.429 06:49:50 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:20.429 06:49:50 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:20.429 06:49:50 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:20.429 06:49:50 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:20.429 06:49:50 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:20.429 06:49:50 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:20.429 06:49:50 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:20.429 06:49:50 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:20.429 06:49:50 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:20.429 06:49:50 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:20.429 06:49:50 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:20.429 06:49:50 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:20.429 06:49:50 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:20.429 06:49:50 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:20.429 06:49:50 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=n 00:07:20.429 06:49:50 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:20.429 06:49:50 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:20.429 06:49:50 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:20.429 06:49:50 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:20.429 06:49:50 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:20.429 06:49:50 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:20.429 06:49:50 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:20.429 06:49:50 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:20.429 06:49:50 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:20.429 06:49:50 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:20.429 06:49:50 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:20.429 06:49:50 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:20.429 06:49:50 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:20.429 06:49:50 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:20.430 06:49:50 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:20.430 06:49:50 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:20.430 06:49:50 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:20.430 06:49:50 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:20.430 
06:49:50 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:20.430 06:49:50 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:20.430 06:49:50 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:20.430 06:49:50 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:20.430 06:49:50 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:20.430 06:49:50 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:20.430 06:49:50 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:20.430 06:49:50 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:20.430 06:49:50 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:20.430 06:49:50 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:20.430 06:49:50 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:20.430 06:49:50 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:20.430 06:49:50 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:20.430 06:49:50 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:20.430 06:49:50 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:20.430 06:49:50 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:20.430 06:49:50 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:20.430 #define SPDK_CONFIG_H 00:07:20.430 #define SPDK_CONFIG_APPS 1 00:07:20.430 #define SPDK_CONFIG_ARCH native 00:07:20.430 #undef SPDK_CONFIG_ASAN 00:07:20.430 #undef SPDK_CONFIG_AVAHI 00:07:20.430 #undef SPDK_CONFIG_CET 00:07:20.430 #define SPDK_CONFIG_COVERAGE 1 00:07:20.430 #define SPDK_CONFIG_CROSS_PREFIX 00:07:20.430 #undef SPDK_CONFIG_CRYPTO 00:07:20.430 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:20.430 #undef SPDK_CONFIG_CUSTOMOCF 00:07:20.430 #undef SPDK_CONFIG_DAOS 00:07:20.430 #define SPDK_CONFIG_DAOS_DIR 00:07:20.430 #define SPDK_CONFIG_DEBUG 1 00:07:20.430 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:20.430 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:20.430 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:20.430 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:20.430 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:20.430 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:20.430 #define SPDK_CONFIG_EXAMPLES 1 00:07:20.430 #undef SPDK_CONFIG_FC 00:07:20.430 #define SPDK_CONFIG_FC_PATH 00:07:20.430 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:20.430 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:20.430 #undef SPDK_CONFIG_FUSE 00:07:20.430 #define SPDK_CONFIG_FUZZER 1 00:07:20.430 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:20.430 #undef SPDK_CONFIG_GOLANG 00:07:20.430 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:20.430 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:20.430 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:20.430 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:20.430 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:20.430 #define SPDK_CONFIG_IDXD 1 00:07:20.430 #undef SPDK_CONFIG_IDXD_KERNEL 00:07:20.430 #undef SPDK_CONFIG_IPSEC_MB 00:07:20.430 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:20.430 #define SPDK_CONFIG_ISAL 1 00:07:20.430 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:20.430 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:20.430 #define SPDK_CONFIG_LIBDIR 00:07:20.430 #undef SPDK_CONFIG_LTO 00:07:20.430 #define SPDK_CONFIG_MAX_LCORES 00:07:20.430 #define SPDK_CONFIG_NVME_CUSE 1 00:07:20.430 #undef SPDK_CONFIG_OCF 00:07:20.430 #define SPDK_CONFIG_OCF_PATH 00:07:20.430 #define SPDK_CONFIG_OPENSSL_PATH 00:07:20.430 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:20.430 #undef SPDK_CONFIG_PGO_USE 00:07:20.430 #define SPDK_CONFIG_PREFIX /usr/local 00:07:20.430 #undef SPDK_CONFIG_RAID5F 00:07:20.430 #undef SPDK_CONFIG_RBD 00:07:20.430 #define SPDK_CONFIG_RDMA 1 00:07:20.430 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:20.430 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:20.430 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:20.430 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:20.430 #undef SPDK_CONFIG_SHARED 00:07:20.430 #undef SPDK_CONFIG_SMA 00:07:20.430 #define SPDK_CONFIG_TESTS 1 00:07:20.430 #undef SPDK_CONFIG_TSAN 00:07:20.430 #define SPDK_CONFIG_UBLK 1 00:07:20.430 #define SPDK_CONFIG_UBSAN 1 00:07:20.430 #undef SPDK_CONFIG_UNIT_TESTS 00:07:20.430 #undef SPDK_CONFIG_URING 00:07:20.430 #define SPDK_CONFIG_URING_PATH 00:07:20.430 #undef SPDK_CONFIG_URING_ZNS 00:07:20.430 #undef SPDK_CONFIG_USDT 00:07:20.430 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:20.430 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:20.430 #define SPDK_CONFIG_VFIO_USER 1 00:07:20.430 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:20.430 #define SPDK_CONFIG_VHOST 1 00:07:20.430 #define SPDK_CONFIG_VIRTIO 1 00:07:20.430 #undef SPDK_CONFIG_VTUNE 00:07:20.430 #define SPDK_CONFIG_VTUNE_DIR 00:07:20.430 #define SPDK_CONFIG_WERROR 1 00:07:20.430 #define SPDK_CONFIG_WPDK_DIR 00:07:20.430 #undef SPDK_CONFIG_XNVME 00:07:20.430 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:20.430 06:49:50 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:20.430 06:49:50 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:20.430 06:49:50 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:20.430 06:49:50 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:20.430 06:49:50 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:20.430 06:49:50 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.430 06:49:50 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.430 06:49:50 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.430 06:49:50 -- paths/export.sh@5 -- # export PATH 00:07:20.430 06:49:50 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.430 06:49:50 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:20.430 06:49:50 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:20.430 06:49:50 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:20.692 06:49:50 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:20.692 06:49:50 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:20.692 06:49:50 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:20.692 06:49:50 -- pm/common@16 -- # TEST_TAG=N/A 00:07:20.692 06:49:50 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:20.692 06:49:50 -- common/autotest_common.sh@52 -- # : 1 00:07:20.692 06:49:50 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:20.692 06:49:50 -- common/autotest_common.sh@56 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:20.692 06:49:50 -- common/autotest_common.sh@58 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:20.692 06:49:50 -- common/autotest_common.sh@60 -- # : 1 00:07:20.692 06:49:50 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:20.692 06:49:50 -- common/autotest_common.sh@62 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:20.692 06:49:50 -- common/autotest_common.sh@64 -- # : 00:07:20.692 06:49:50 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:20.692 06:49:50 -- common/autotest_common.sh@66 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:20.692 06:49:50 
-- common/autotest_common.sh@68 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:20.692 06:49:50 -- common/autotest_common.sh@70 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:20.692 06:49:50 -- common/autotest_common.sh@72 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:20.692 06:49:50 -- common/autotest_common.sh@74 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:20.692 06:49:50 -- common/autotest_common.sh@76 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:20.692 06:49:50 -- common/autotest_common.sh@78 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:20.692 06:49:50 -- common/autotest_common.sh@80 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:20.692 06:49:50 -- common/autotest_common.sh@82 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:20.692 06:49:50 -- common/autotest_common.sh@84 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:20.692 06:49:50 -- common/autotest_common.sh@86 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:20.692 06:49:50 -- common/autotest_common.sh@88 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:20.692 06:49:50 -- common/autotest_common.sh@90 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:20.692 06:49:50 -- common/autotest_common.sh@92 -- # : 1 00:07:20.692 06:49:50 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:20.692 06:49:50 -- common/autotest_common.sh@94 -- # : 1 00:07:20.692 06:49:50 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:20.692 06:49:50 -- common/autotest_common.sh@96 -- # : rdma 00:07:20.692 06:49:50 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:20.692 06:49:50 -- common/autotest_common.sh@98 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:20.692 06:49:50 -- common/autotest_common.sh@100 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:20.692 06:49:50 -- common/autotest_common.sh@102 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:20.692 06:49:50 -- common/autotest_common.sh@104 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:20.692 06:49:50 -- common/autotest_common.sh@106 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:20.692 06:49:50 -- common/autotest_common.sh@108 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:20.692 06:49:50 -- common/autotest_common.sh@110 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:20.692 06:49:50 -- common/autotest_common.sh@112 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:20.692 06:49:50 -- common/autotest_common.sh@114 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:20.692 
06:49:50 -- common/autotest_common.sh@116 -- # : 1 00:07:20.692 06:49:50 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:20.692 06:49:50 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:20.692 06:49:50 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:20.692 06:49:50 -- common/autotest_common.sh@120 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:20.692 06:49:50 -- common/autotest_common.sh@122 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:20.692 06:49:50 -- common/autotest_common.sh@124 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:20.692 06:49:50 -- common/autotest_common.sh@126 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:20.692 06:49:50 -- common/autotest_common.sh@128 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:20.692 06:49:50 -- common/autotest_common.sh@130 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:20.692 06:49:50 -- common/autotest_common.sh@132 -- # : v23.11 00:07:20.692 06:49:50 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:20.692 06:49:50 -- common/autotest_common.sh@134 -- # : true 00:07:20.692 06:49:50 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:20.692 06:49:50 -- common/autotest_common.sh@136 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:20.692 06:49:50 -- common/autotest_common.sh@138 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:20.692 06:49:50 -- common/autotest_common.sh@140 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:20.692 06:49:50 -- common/autotest_common.sh@142 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:20.692 06:49:50 -- common/autotest_common.sh@144 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:20.692 06:49:50 -- common/autotest_common.sh@146 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:20.692 06:49:50 -- common/autotest_common.sh@148 -- # : 00:07:20.692 06:49:50 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:20.692 06:49:50 -- common/autotest_common.sh@150 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:20.692 06:49:50 -- common/autotest_common.sh@152 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:20.692 06:49:50 -- common/autotest_common.sh@154 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:20.692 06:49:50 -- common/autotest_common.sh@156 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:20.692 06:49:50 -- common/autotest_common.sh@158 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:20.692 06:49:50 -- common/autotest_common.sh@160 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:20.692 06:49:50 -- common/autotest_common.sh@163 -- # : 00:07:20.692 06:49:50 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:20.692 06:49:50 -- common/autotest_common.sh@165 -- # : 0 00:07:20.692 06:49:50 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:20.693 06:49:50 -- common/autotest_common.sh@167 -- # : 0 00:07:20.693 06:49:50 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:20.693 06:49:50 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:20.693 06:49:50 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:20.693 06:49:50 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:20.693 06:49:50 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:20.693 06:49:50 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:20.693 06:49:50 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:20.693 06:49:50 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:20.693 06:49:50 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:20.693 06:49:50 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:20.693 06:49:50 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:20.693 06:49:50 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:20.693 06:49:50 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:20.693 06:49:50 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:20.693 06:49:50 -- common/autotest_common.sh@196 -- # cat 00:07:20.693 06:49:50 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:20.693 06:49:50 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:20.693 06:49:50 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:20.693 06:49:50 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:20.693 06:49:50 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:20.693 06:49:50 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:20.693 06:49:50 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:20.693 06:49:50 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:20.693 06:49:50 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:20.693 06:49:50 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:20.693 06:49:50 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:20.693 06:49:50 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:20.693 06:49:50 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:20.693 06:49:50 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:20.693 06:49:50 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:20.693 06:49:50 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:20.693 06:49:50 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:20.693 06:49:50 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:20.693 06:49:50 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:20.693 06:49:50 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:20.693 06:49:50 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:20.693 06:49:50 -- common/autotest_common.sh@249 -- # valgrind= 00:07:20.693 06:49:50 -- common/autotest_common.sh@255 -- # uname -s 00:07:20.693 06:49:50 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:20.693 06:49:50 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:20.693 06:49:50 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:20.693 06:49:50 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:20.693 06:49:50 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:20.693 06:49:50 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:20.693 06:49:50 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:20.693 06:49:50 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:20.693 06:49:50 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:20.693 06:49:50 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:20.693 06:49:50 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:20.693 06:49:50 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:20.693 06:49:50 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:20.693 06:49:50 -- common/autotest_common.sh@309 -- # [[ -z 2618417 ]] 00:07:20.693 06:49:50 -- common/autotest_common.sh@309 -- # kill -0 2618417 00:07:20.693 06:49:50 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:20.693 06:49:50 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:20.693 06:49:50 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:20.693 06:49:50 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:20.693 06:49:50 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:20.693 06:49:50 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:20.693 06:49:50 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:20.693 06:49:50 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:20.693 06:49:50 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.KxRD9R 00:07:20.693 06:49:50 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:20.693 06:49:50 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:20.693 06:49:50 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:07:20.693 06:49:50 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.KxRD9R/tests/nvmf /tmp/spdk.KxRD9R 00:07:20.693 06:49:50 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:20.693 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.693 06:49:50 -- common/autotest_common.sh@318 -- # df -T 00:07:20.693 06:49:50 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:07:20.693 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:20.693 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:20.693 06:49:50 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:20.693 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:20.693 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=1052192768 00:07:20.693 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:20.693 06:49:50 -- common/autotest_common.sh@354 -- # uses["$mount"]=4232237056 00:07:20.693 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:20.693 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=52188147712 00:07:20.693 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742297088 00:07:20.693 06:49:50 -- common/autotest_common.sh@354 -- # uses["$mount"]=9554149376 00:07:20.693 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:20.693 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868553728 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871146496 00:07:20.694 06:49:50 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:20.694 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.694 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342489088 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348461056 00:07:20.694 06:49:50 -- common/autotest_common.sh@354 -- # uses["$mount"]=5971968 00:07:20.694 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.694 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870499328 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871150592 00:07:20.694 06:49:50 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=651264 00:07:20.694 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.694 06:49:50 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:20.694 06:49:50 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:20.694 06:49:50 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:20.694 06:49:50 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:20.694 06:49:50 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:20.694 * Looking for test storage... 00:07:20.694 06:49:50 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:20.694 06:49:50 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:20.694 06:49:50 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:20.694 06:49:50 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:20.694 06:49:50 -- common/autotest_common.sh@363 -- # mount=/ 00:07:20.694 06:49:50 -- common/autotest_common.sh@365 -- # target_space=52188147712 00:07:20.694 06:49:50 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:20.694 06:49:50 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:20.694 06:49:50 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:20.694 06:49:50 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:20.694 06:49:50 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:20.694 06:49:50 -- common/autotest_common.sh@372 -- # new_size=11768741888 00:07:20.694 06:49:50 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:20.694 06:49:50 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:20.694 06:49:50 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:20.694 06:49:50 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:20.694 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:20.694 06:49:50 -- common/autotest_common.sh@380 -- # return 0 00:07:20.694 06:49:50 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:20.694 06:49:50 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:20.694 06:49:50 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:20.694 06:49:50 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:20.694 06:49:50 -- common/autotest_common.sh@1672 -- # true 00:07:20.694 06:49:50 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:20.694 06:49:50 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:20.694 06:49:50 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:20.694 06:49:50 -- common/autotest_common.sh@27 -- # exec 00:07:20.694 06:49:50 -- common/autotest_common.sh@29 -- # exec 00:07:20.694 06:49:50 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:07:20.694 06:49:50 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:20.694 06:49:50 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:20.694 06:49:50 -- common/autotest_common.sh@18 -- # set -x 00:07:20.694 06:49:50 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:20.694 06:49:50 -- ../common.sh@8 -- # pids=() 00:07:20.694 06:49:50 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:20.694 06:49:50 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:20.694 06:49:50 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:20.694 06:49:50 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:20.694 06:49:50 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:20.694 06:49:50 -- nvmf/run.sh@61 -- # mem_size=512 00:07:20.694 06:49:50 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:20.694 06:49:50 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:20.694 06:49:50 -- ../common.sh@69 -- # local fuzz_num=25 00:07:20.694 06:49:50 -- ../common.sh@70 -- # local time=1 00:07:20.694 06:49:50 -- ../common.sh@72 -- # (( i = 0 )) 00:07:20.694 06:49:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:20.694 06:49:50 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:20.694 06:49:50 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:20.694 06:49:50 -- nvmf/run.sh@24 -- # local timen=1 00:07:20.694 06:49:50 -- nvmf/run.sh@25 -- # local core=0x1 00:07:20.694 06:49:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:20.694 06:49:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:20.694 06:49:50 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:20.694 06:49:50 -- nvmf/run.sh@29 -- # port=4400 00:07:20.694 06:49:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:20.694 06:49:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:20.694 06:49:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:20.694 06:49:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:20.694 [2024-04-27 06:49:50.481701] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
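The trace above is the complete launch recipe for one short-fuzz instance: start_llvm_fuzz zero-pads the fuzzer index with printf %02d, turns it into a dedicated NVMe/TCP service port (4400 for fuzzer 0), substitutes that port for the default trsvcid 4420 in the JSON config, and hands the resulting transport ID to llvm_nvme_fuzz via -F. A minimal bash sketch of that derivation follows; the trace only shows the printf call and the resulting port=4400, so the exact 44-prefix concatenation is an assumption, and the variable names simply mirror the trace.

# Hypothetical condensation of the traced nvmf/run.sh steps.
fuzzer_type=0
padded=$(printf '%02d' "$fuzzer_type")   # 0 -> "00"
port="44${padded}"                       # -> 4400; fuzzer 1 gets 4401 (assumed concatenation)
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
# Point this instance at its own port, as the traced sed command does:
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    test/fuzz/llvm/nvmf/fuzz_json.conf > "/tmp/fuzz_json_${fuzzer_type}.conf"

Each instance thus listens on its own port, which is why the second fuzzer later in this log comes up on 4401 without tearing down the first target's address.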
00:07:20.694 [2024-04-27 06:49:50.481785] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2618557 ] 00:07:20.694 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.953 [2024-04-27 06:49:50.662277] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.953 [2024-04-27 06:49:50.682623] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:20.953 [2024-04-27 06:49:50.682753] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.953 [2024-04-27 06:49:50.734350] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:20.953 [2024-04-27 06:49:50.750668] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:20.953 INFO: Running with entropic power schedule (0xFF, 100). 00:07:20.953 INFO: Seed: 2231426761 00:07:20.953 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:20.953 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:20.953 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:20.953 INFO: A corpus is not provided, starting from an empty corpus 00:07:20.953 #2 INITED exec/s: 0 rss: 59Mb 00:07:20.953 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:20.953 This may also happen if the target rejected all inputs we tried so far 00:07:20.953 [2024-04-27 06:49:50.799785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-04-27 06:49:50.799814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.212 NEW_FUNC[1/661]: 0x49d5e0 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:21.212 NEW_FUNC[2/661]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:21.212 #10 NEW cov: 11440 ft: 11466 corp: 2/126b lim: 320 exec/s: 0 rss: 67Mb L: 125/125 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:21.472 [2024-04-27 06:49:51.110607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.472 [2024-04-27 06:49:51.110640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 NEW_FUNC[1/3]: 0x1c79240 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:795 00:07:21.472 NEW_FUNC[2/3]: 0x1c793e0 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1151 00:07:21.472 #13 NEW cov: 11582 ft: 12208 corp: 3/231b lim: 320 exec/s: 0 rss: 68Mb L: 105/125 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:07:21.472 [2024-04-27 06:49:51.150638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.472 [2024-04-27 06:49:51.150664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 #17 NEW cov: 11589 ft: 12401 corp: 4/337b lim: 320 exec/s: 0 rss: 68Mb L: 106/125 MS: 4 CopyPart-ChangeBit-ShuffleBytes-CrossOver- 00:07:21.472 [2024-04-27 06:49:51.180728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.472 [2024-04-27 06:49:51.180754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 #18 NEW cov: 11674 ft: 12733 corp: 5/442b lim: 320 exec/s: 0 rss: 68Mb L: 105/125 MS: 1 ChangeBinInt- 00:07:21.472 [2024-04-27 06:49:51.220848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.472 [2024-04-27 06:49:51.220873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 #19 NEW cov: 11674 ft: 12821 corp: 6/567b lim: 320 exec/s: 0 rss: 68Mb L: 125/125 MS: 1 CopyPart- 00:07:21.472 [2024-04-27 06:49:51.260936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.472 [2024-04-27 06:49:51.260961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 #20 NEW cov: 11674 ft: 12964 corp: 7/693b lim: 320 exec/s: 0 rss: 68Mb L: 126/126 MS: 1 InsertByte- 00:07:21.472 [2024-04-27 06:49:51.301105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.472 [2024-04-27 06:49:51.301130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 #21 NEW cov: 11674 ft: 13048 corp: 8/799b lim: 320 exec/s: 0 rss: 68Mb L: 106/126 MS: 1 ChangeBinInt- 00:07:21.472 [2024-04-27 06:49:51.341203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.472 [2024-04-27 06:49:51.341227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.472 #22 NEW cov: 11674 ft: 13067 corp: 9/924b lim: 320 exec/s: 0 rss: 68Mb L: 125/126 MS: 1 CrossOver- 00:07:21.732 [2024-04-27 06:49:51.381370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.732 [2024-04-27 06:49:51.381402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.732 #23 NEW cov: 11674 ft: 13117 corp: 10/1030b lim: 320 exec/s: 0 rss: 69Mb L: 106/126 MS: 1 ChangeBinInt- 00:07:21.732 [2024-04-27 06:49:51.421440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.732 [2024-04-27 06:49:51.421465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.732 #24 NEW cov: 11674 ft: 13230 corp: 11/1136b lim: 320 exec/s: 0 rss: 69Mb L: 106/126 MS: 1 ChangeBit- 00:07:21.732 [2024-04-27 06:49:51.461513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000000a 00:07:21.732 [2024-04-27 06:49:51.461539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.732 #25 NEW cov: 11674 ft: 13330 corp: 12/1261b lim: 320 exec/s: 0 rss: 69Mb L: 125/126 MS: 1 CrossOver- 00:07:21.732 [2024-04-27 06:49:51.491609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.732 [2024-04-27 06:49:51.491636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.732 #26 NEW cov: 11674 ft: 13350 corp: 13/1328b lim: 320 exec/s: 0 rss: 69Mb L: 67/126 MS: 1 EraseBytes- 00:07:21.732 [2024-04-27 06:49:51.531755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.732 [2024-04-27 06:49:51.531782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.732 #27 NEW cov: 11674 ft: 13451 corp: 14/1442b lim: 320 exec/s: 0 rss: 69Mb L: 114/126 MS: 1 CMP- DE: "=\325\302\240\343\021x\000"- 00:07:21.732 [2024-04-27 06:49:51.571856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.732 [2024-04-27 06:49:51.571885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.732 #28 NEW cov: 11674 ft: 13511 corp: 15/1548b lim: 320 exec/s: 0 rss: 69Mb L: 106/126 MS: 1 CMP- DE: "\377\377\011x"- 00:07:21.732 [2024-04-27 06:49:51.611917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.732 [2024-04-27 06:49:51.611942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 #29 NEW cov: 11674 ft: 13645 corp: 16/1673b lim: 320 exec/s: 0 rss: 69Mb L: 125/126 MS: 1 ChangeBit- 00:07:21.992 [2024-04-27 06:49:51.642202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.992 [2024-04-27 06:49:51.642228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.642308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.642324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.642385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.642404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.992 NEW_FUNC[1/1]: 0x12f6060 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:21.992 #30 NEW cov: 11705 ft: 
13908 corp: 17/1921b lim: 320 exec/s: 0 rss: 69Mb L: 248/248 MS: 1 InsertRepeatedBytes- 00:07:21.992 [2024-04-27 06:49:51.682428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.992 [2024-04-27 06:49:51.682459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.682519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:5 nsid:44444444 cdw10:ffffffff cdw11:ffffffff 00:07:21.992 [2024-04-27 06:49:51.682532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.682594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.682608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.992 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:21.992 #31 NEW cov: 11728 ft: 14086 corp: 18/2117b lim: 320 exec/s: 0 rss: 69Mb L: 196/248 MS: 1 InsertRepeatedBytes- 00:07:21.992 [2024-04-27 06:49:51.722654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:44000000 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.992 [2024-04-27 06:49:51.722680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.722739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:5 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:21.992 [2024-04-27 06:49:51.722753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.722810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.722827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.722888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:7 nsid:ffff0000 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.992 [2024-04-27 06:49:51.722901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.992 #32 NEW cov: 11728 ft: 14273 corp: 19/2376b lim: 320 exec/s: 0 rss: 69Mb L: 259/259 MS: 1 CrossOver- 00:07:21.992 [2024-04-27 06:49:51.762398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:21.992 [2024-04-27 06:49:51.762423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 #34 NEW cov: 11728 ft: 14287 corp: 20/2489b lim: 320 exec/s: 34 rss: 69Mb L: 113/259 
MS: 2 CrossOver-InsertRepeatedBytes- 00:07:21.992 [2024-04-27 06:49:51.802724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0000ffff 00:07:21.992 [2024-04-27 06:49:51.802750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.802813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.802828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.802890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.802904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.992 #35 NEW cov: 11728 ft: 14306 corp: 21/2737b lim: 320 exec/s: 35 rss: 69Mb L: 248/259 MS: 1 ChangeBinInt- 00:07:21.992 [2024-04-27 06:49:51.842932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.992 [2024-04-27 06:49:51.842958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.843036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:5 nsid:44444444 cdw10:ffffffff cdw11:ffffffff 00:07:21.992 [2024-04-27 06:49:51.843050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.992 [2024-04-27 06:49:51.843113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:21.992 [2024-04-27 06:49:51.843137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.992 #36 NEW cov: 11728 ft: 14324 corp: 22/2933b lim: 320 exec/s: 36 rss: 69Mb L: 196/259 MS: 1 ChangeBit- 00:07:21.992 [2024-04-27 06:49:51.882789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:21.992 [2024-04-27 06:49:51.882815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 #37 NEW cov: 11728 ft: 14367 corp: 23/3050b lim: 320 exec/s: 37 rss: 69Mb L: 117/259 MS: 1 PersAutoDict- DE: "\377\377\011x"- 00:07:22.263 [2024-04-27 06:49:51.923167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.263 [2024-04-27 06:49:51.923193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 [2024-04-27 06:49:51.923249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 00:07:22.263 [2024-04-27 06:49:51.923262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.263 [2024-04-27 06:49:51.923316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:a0c2d53d cdw11:007811e3 00:07:22.263 [2024-04-27 06:49:51.923330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.263 #38 NEW cov: 11728 ft: 14391 corp: 24/3255b lim: 320 exec/s: 38 rss: 69Mb L: 205/259 MS: 1 CopyPart- 00:07:22.263 [2024-04-27 06:49:51.963027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:22.263 [2024-04-27 06:49:51.963053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 #39 NEW cov: 11728 ft: 14448 corp: 25/3368b lim: 320 exec/s: 39 rss: 69Mb L: 113/259 MS: 1 CrossOver- 00:07:22.263 [2024-04-27 06:49:52.003120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.263 [2024-04-27 06:49:52.003145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 #40 NEW cov: 11728 ft: 14459 corp: 26/3474b lim: 320 exec/s: 40 rss: 70Mb L: 106/259 MS: 1 CrossOver- 00:07:22.263 [2024-04-27 06:49:52.043243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.263 [2024-04-27 06:49:52.043267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 #41 NEW cov: 11728 ft: 14468 corp: 27/3580b lim: 320 exec/s: 41 rss: 70Mb L: 106/259 MS: 1 InsertByte- 00:07:22.263 [2024-04-27 06:49:52.073355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.263 [2024-04-27 06:49:52.073380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 #42 NEW cov: 11728 ft: 14484 corp: 28/3686b lim: 320 exec/s: 42 rss: 70Mb L: 106/259 MS: 1 PersAutoDict- DE: "\377\377\011x"- 00:07:22.263 [2024-04-27 06:49:52.113447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.263 [2024-04-27 06:49:52.113472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 #43 NEW cov: 11728 ft: 14500 corp: 29/3754b lim: 320 exec/s: 43 rss: 70Mb L: 68/259 MS: 1 InsertByte- 00:07:22.263 [2024-04-27 06:49:52.153709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.263 [2024-04-27 06:49:52.153734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.263 [2024-04-27 06:49:52.153794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 
cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:22.263 [2024-04-27 06:49:52.153808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.524 #44 NEW cov: 11728 ft: 14619 corp: 30/3915b lim: 320 exec/s: 44 rss: 70Mb L: 161/259 MS: 1 EraseBytes- 00:07:22.524 [2024-04-27 06:49:52.193723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.524 [2024-04-27 06:49:52.193749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 #45 NEW cov: 11728 ft: 14631 corp: 31/4025b lim: 320 exec/s: 45 rss: 70Mb L: 110/259 MS: 1 EraseBytes- 00:07:22.524 [2024-04-27 06:49:52.233847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.524 [2024-04-27 06:49:52.233871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 #46 NEW cov: 11728 ft: 14666 corp: 32/4150b lim: 320 exec/s: 46 rss: 70Mb L: 125/259 MS: 1 ChangeBit- 00:07:22.524 [2024-04-27 06:49:52.264203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.524 [2024-04-27 06:49:52.264228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 [2024-04-27 06:49:52.264283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.524 [2024-04-27 06:49:52.264297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.524 [2024-04-27 06:49:52.264353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:a0c2d53d cdw11:007811e3 00:07:22.524 [2024-04-27 06:49:52.264366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.524 #47 NEW cov: 11728 ft: 14696 corp: 33/4355b lim: 320 exec/s: 47 rss: 70Mb L: 205/259 MS: 1 ChangeBinInt- 00:07:22.524 [2024-04-27 06:49:52.304039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ffff0000 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.524 [2024-04-27 06:49:52.304063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 #48 NEW cov: 11728 ft: 14705 corp: 34/4461b lim: 320 exec/s: 48 rss: 70Mb L: 106/259 MS: 1 PersAutoDict- DE: "\377\377\011x"- 00:07:22.524 [2024-04-27 06:49:52.334133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:22.524 [2024-04-27 06:49:52.334158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 #49 NEW cov: 11728 ft: 14710 corp: 35/4578b lim: 320 exec/s: 49 rss: 70Mb L: 117/259 MS: 1 ChangeBit- 00:07:22.524 [2024-04-27 06:49:52.374213] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:22.524 [2024-04-27 06:49:52.374238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 #50 NEW cov: 11728 ft: 14716 corp: 36/4696b lim: 320 exec/s: 50 rss: 70Mb L: 118/259 MS: 1 InsertByte- 00:07:22.524 [2024-04-27 06:49:52.414639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.524 [2024-04-27 06:49:52.414665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.524 [2024-04-27 06:49:52.414721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.524 [2024-04-27 06:49:52.414735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.524 [2024-04-27 06:49:52.414794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:00000000 cdw11:fa000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.524 [2024-04-27 06:49:52.414833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.784 #51 NEW cov: 11728 ft: 14790 corp: 37/4917b lim: 320 exec/s: 51 rss: 70Mb L: 221/259 MS: 1 InsertRepeatedBytes- 00:07:22.784 [2024-04-27 06:49:52.454444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b3) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.784 [2024-04-27 06:49:52.454469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.784 #54 NEW cov: 11728 ft: 14796 corp: 38/5007b lim: 320 exec/s: 54 rss: 70Mb L: 90/259 MS: 3 EraseBytes-CMP-InsertRepeatedBytes- DE: "\263)\012\002\000\000\000\000"- 00:07:22.784 [2024-04-27 06:49:52.494598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:22.784 [2024-04-27 06:49:52.494624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.784 #55 NEW cov: 11728 ft: 14833 corp: 39/5125b lim: 320 exec/s: 55 rss: 70Mb L: 118/259 MS: 1 InsertByte- 00:07:22.784 [2024-04-27 06:49:52.534712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:b50000 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:22.784 [2024-04-27 06:49:52.534738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.784 #56 NEW cov: 11728 ft: 14852 corp: 40/5243b lim: 320 exec/s: 56 rss: 70Mb L: 118/259 MS: 1 PersAutoDict- DE: "=\325\302\240\343\021x\000"- 00:07:22.784 [2024-04-27 06:49:52.574853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.784 [2024-04-27 06:49:52.574878] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.784 #57 NEW cov: 11728 ft: 14858 corp: 41/5349b lim: 320 exec/s: 57 rss: 70Mb L: 106/259 MS: 1 ChangeBinInt- 00:07:22.784 [2024-04-27 06:49:52.614981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.784 [2024-04-27 06:49:52.615006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.784 #58 NEW cov: 11728 ft: 14896 corp: 42/5454b lim: 320 exec/s: 58 rss: 70Mb L: 105/259 MS: 1 CrossOver- 00:07:22.784 [2024-04-27 06:49:52.655079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.784 [2024-04-27 06:49:52.655104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.784 #59 NEW cov: 11728 ft: 14906 corp: 43/5579b lim: 320 exec/s: 59 rss: 70Mb L: 125/259 MS: 1 ChangeByte- 00:07:23.045 [2024-04-27 06:49:52.695159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:ffff0000 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.045 [2024-04-27 06:49:52.695184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.045 #60 NEW cov: 11728 ft: 14907 corp: 44/5685b lim: 320 exec/s: 60 rss: 70Mb L: 106/259 MS: 1 ChangeByte- 00:07:23.045 [2024-04-27 06:49:52.735516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b5) qid:0 cid:4 nsid:ffffffff cdw10:44444444 cdw11:00444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.045 [2024-04-27 06:49:52.735541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.045 [2024-04-27 06:49:52.735603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:5 nsid:44444444 cdw10:ffffffff cdw11:ffffffff 00:07:23.045 [2024-04-27 06:49:52.735619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.045 [2024-04-27 06:49:52.735682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.045 [2024-04-27 06:49:52.735696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.045 #61 NEW cov: 11728 ft: 14919 corp: 45/5881b lim: 320 exec/s: 61 rss: 70Mb L: 196/259 MS: 1 ChangeBinInt- 00:07:23.045 [2024-04-27 06:49:52.775380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f2) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.045 [2024-04-27 06:49:52.775408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.045 #62 NEW cov: 11728 ft: 14935 corp: 46/5987b lim: 320 exec/s: 31 rss: 70Mb L: 106/259 MS: 1 CopyPart- 00:07:23.045 #62 DONE cov: 11728 ft: 14935 corp: 46/5987b lim: 320 exec/s: 31 rss: 70Mb 00:07:23.045 ###### Recommended dictionary. 
###### 00:07:23.045 "=\325\302\240\343\021x\000" # Uses: 1 00:07:23.045 "\377\377\011x" # Uses: 3 00:07:23.045 "\263)\012\002\000\000\000\000" # Uses: 0 00:07:23.045 ###### End of recommended dictionary. ###### 00:07:23.045 Done 62 runs in 2 second(s) 00:07:23.045 06:49:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:23.045 06:49:52 -- ../common.sh@72 -- # (( i++ )) 00:07:23.045 06:49:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:23.045 06:49:52 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:23.045 06:49:52 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:23.045 06:49:52 -- nvmf/run.sh@24 -- # local timen=1 00:07:23.045 06:49:52 -- nvmf/run.sh@25 -- # local core=0x1 00:07:23.045 06:49:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:23.045 06:49:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:23.045 06:49:52 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:23.045 06:49:52 -- nvmf/run.sh@29 -- # port=4401 00:07:23.045 06:49:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:23.045 06:49:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:23.045 06:49:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:23.045 06:49:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:23.304 [2024-04-27 06:49:52.949916] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:23.304 [2024-04-27 06:49:52.950011] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2618856 ] 00:07:23.304 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.304 [2024-04-27 06:49:53.133553] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.304 [2024-04-27 06:49:53.152954] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:23.304 [2024-04-27 06:49:53.153096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.563 [2024-04-27 06:49:53.204512] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:23.563 [2024-04-27 06:49:53.220828] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:23.563 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:23.563 INFO: Seed: 407465652 00:07:23.563 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:23.563 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:23.563 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:23.563 INFO: A corpus is not provided, starting from an empty corpus 00:07:23.563 #2 INITED exec/s: 0 rss: 59Mb 00:07:23.563 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:23.563 This may also happen if the target rejected all inputs we tried so far 00:07:23.563 [2024-04-27 06:49:53.265774] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:23.563 [2024-04-27 06:49:53.265985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.563 [2024-04-27 06:49:53.266015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.822 NEW_FUNC[1/661]: 0x49dee0 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:23.822 NEW_FUNC[2/661]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:23.822 #12 NEW cov: 11489 ft: 11532 corp: 2/10b lim: 30 exec/s: 0 rss: 67Mb L: 9/9 MS: 5 InsertByte-ChangeBit-CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:23.822 [2024-04-27 06:49:53.566561] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (52224) > buf size (4096) 00:07:23.822 [2024-04-27 06:49:53.566685] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:23.822 [2024-04-27 06:49:53.566900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff00dc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.822 [2024-04-27 06:49:53.566939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.822 [2024-04-27 06:49:53.567002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.822 [2024-04-27 06:49:53.567021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.822 NEW_FUNC[1/3]: 0x1972290 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:07:23.822 NEW_FUNC[2/3]: 0x19739e0 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:07:23.822 #18 NEW cov: 11667 ft: 12483 corp: 3/23b lim: 30 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:07:23.822 [2024-04-27 06:49:53.616562] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:23.822 [2024-04-27 06:49:53.616755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.822 [2024-04-27 06:49:53.616780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.822 #24 NEW cov: 11673 ft: 
12644 corp: 4/34b lim: 30 exec/s: 0 rss: 67Mb L: 11/13 MS: 1 CopyPart- 00:07:23.822 [2024-04-27 06:49:53.656681] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:23.822 [2024-04-27 06:49:53.656874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.822 [2024-04-27 06:49:53.656898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.822 #25 NEW cov: 11758 ft: 12935 corp: 5/44b lim: 30 exec/s: 0 rss: 67Mb L: 10/13 MS: 1 InsertByte- 00:07:23.822 [2024-04-27 06:49:53.696839] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:23.822 [2024-04-27 06:49:53.696949] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:23.822 [2024-04-27 06:49:53.697143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.822 [2024-04-27 06:49:53.697172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.822 [2024-04-27 06:49:53.697226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.822 [2024-04-27 06:49:53.697240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.822 #26 NEW cov: 11758 ft: 13034 corp: 6/58b lim: 30 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 CopyPart- 00:07:24.081 [2024-04-27 06:49:53.736927] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (52224) > buf size (4096) 00:07:24.081 [2024-04-27 06:49:53.737038] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.081 [2024-04-27 06:49:53.737233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff00dc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.081 [2024-04-27 06:49:53.737258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.081 [2024-04-27 06:49:53.737313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.081 [2024-04-27 06:49:53.737327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.081 #27 NEW cov: 11758 ft: 13156 corp: 7/71b lim: 30 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 CrossOver- 00:07:24.081 [2024-04-27 06:49:53.777081] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (52224) > buf size (4096) 00:07:24.081 [2024-04-27 06:49:53.777187] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:24.082 [2024-04-27 06:49:53.777387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff00dc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.777418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.082 [2024-04-27 06:49:53.777469] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.777482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.082 #28 NEW cov: 11758 ft: 13191 corp: 8/84b lim: 30 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 ChangeBinInt- 00:07:24.082 [2024-04-27 06:49:53.817161] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.082 [2024-04-27 06:49:53.817367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.817391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.082 #29 NEW cov: 11758 ft: 13203 corp: 9/93b lim: 30 exec/s: 0 rss: 68Mb L: 9/14 MS: 1 ShuffleBytes- 00:07:24.082 [2024-04-27 06:49:53.847241] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.082 [2024-04-27 06:49:53.847437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.847462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.082 #30 NEW cov: 11758 ft: 13264 corp: 10/104b lim: 30 exec/s: 0 rss: 68Mb L: 11/14 MS: 1 ShuffleBytes- 00:07:24.082 [2024-04-27 06:49:53.887374] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.082 [2024-04-27 06:49:53.887489] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.082 [2024-04-27 06:49:53.887684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.887712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.082 [2024-04-27 06:49:53.887762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.887776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.082 #31 NEW cov: 11758 ft: 13312 corp: 11/118b lim: 30 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 ShuffleBytes- 00:07:24.082 [2024-04-27 06:49:53.927493] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.082 [2024-04-27 06:49:53.927603] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:24.082 [2024-04-27 06:49:53.927807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.927832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.082 [2024-04-27 06:49:53.927884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.927898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.082 #32 NEW cov: 11758 ft: 13327 corp: 12/130b lim: 30 exec/s: 0 rss: 68Mb L: 12/14 MS: 1 InsertByte- 00:07:24.082 [2024-04-27 06:49:53.967621] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.082 [2024-04-27 06:49:53.967822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.082 [2024-04-27 06:49:53.967846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 #38 NEW cov: 11758 ft: 13441 corp: 13/141b lim: 30 exec/s: 0 rss: 68Mb L: 11/14 MS: 1 ChangeBit- 00:07:24.342 [2024-04-27 06:49:54.007746] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.007852] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.008041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.008065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 [2024-04-27 06:49:54.008119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.008133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.342 #39 NEW cov: 11758 ft: 13482 corp: 14/154b lim: 30 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 CrossOver- 00:07:24.342 [2024-04-27 06:49:54.047847] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (52064) > buf size (4096) 00:07:24.342 [2024-04-27 06:49:54.047961] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.048156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32d700ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.048181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 [2024-04-27 06:49:54.048233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:dcff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.048251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.342 #40 NEW cov: 11758 ft: 13502 corp: 15/168b lim: 30 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 InsertByte- 00:07:24.342 [2024-04-27 06:49:54.087985] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.088097] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.088305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ccff83ff 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.088330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 [2024-04-27 06:49:54.088382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.088399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.342 #41 NEW cov: 11758 ft: 13538 corp: 16/182b lim: 30 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 ChangeByte- 00:07:24.342 [2024-04-27 06:49:54.128045] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.128245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.128269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 #42 NEW cov: 11758 ft: 13560 corp: 17/191b lim: 30 exec/s: 0 rss: 68Mb L: 9/14 MS: 1 ShuffleBytes- 00:07:24.342 [2024-04-27 06:49:54.168199] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.168403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.168427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:24.342 #43 NEW cov: 11781 ft: 13592 corp: 18/198b lim: 30 exec/s: 0 rss: 68Mb L: 7/14 MS: 1 CrossOver- 00:07:24.342 [2024-04-27 06:49:54.208273] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.342 [2024-04-27 06:49:54.208478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ab283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.342 [2024-04-27 06:49:54.208502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.342 #44 NEW cov: 11781 ft: 13663 corp: 19/205b lim: 30 exec/s: 0 rss: 69Mb L: 7/14 MS: 1 ChangeBit- 00:07:24.603 [2024-04-27 06:49:54.248428] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.603 [2024-04-27 06:49:54.248630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.248654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 #45 NEW cov: 11781 ft: 13730 corp: 20/214b lim: 30 exec/s: 45 rss: 69Mb L: 9/14 MS: 1 ChangeBinInt- 00:07:24.603 [2024-04-27 06:49:54.288537] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.603 [2024-04-27 06:49:54.288645] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.603 [2024-04-27 06:49:54.288836] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:dcdc83dc cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.288862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 [2024-04-27 06:49:54.288918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.288932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.603 #46 NEW cov: 11781 ft: 13746 corp: 21/231b lim: 30 exec/s: 46 rss: 69Mb L: 17/17 MS: 1 CrossOver- 00:07:24.603 [2024-04-27 06:49:54.328609] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (797388) > buf size (4096) 00:07:24.603 [2024-04-27 06:49:54.328820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ab283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.328845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 #47 NEW cov: 11781 ft: 13761 corp: 22/238b lim: 30 exec/s: 47 rss: 69Mb L: 7/17 MS: 1 ChangeBinInt- 00:07:24.603 [2024-04-27 06:49:54.368750] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.603 [2024-04-27 06:49:54.368952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.368976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 #53 NEW cov: 11781 ft: 13810 corp: 23/249b lim: 30 exec/s: 53 rss: 69Mb L: 11/17 MS: 1 CMP- DE: "\001\000"- 00:07:24.603 [2024-04-27 06:49:54.408858] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.603 [2024-04-27 06:49:54.409059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.409093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 #54 NEW cov: 11781 ft: 13819 corp: 24/260b lim: 30 exec/s: 54 rss: 69Mb L: 11/17 MS: 1 ChangeByte- 00:07:24.603 [2024-04-27 06:49:54.449013] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (51208) > buf size (4096) 00:07:24.603 [2024-04-27 06:49:54.449215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.449239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 #55 NEW cov: 11781 ft: 13822 corp: 25/269b lim: 30 exec/s: 55 rss: 69Mb L: 9/17 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:07:24.603 [2024-04-27 06:49:54.479124] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.603 [2024-04-27 06:49:54.479237] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size 
(4096) 00:07:24.603 [2024-04-27 06:49:54.479431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.479457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.603 [2024-04-27 06:49:54.479511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.603 [2024-04-27 06:49:54.479525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.863 #56 NEW cov: 11781 ft: 13866 corp: 26/281b lim: 30 exec/s: 56 rss: 69Mb L: 12/17 MS: 1 CrossOver- 00:07:24.863 [2024-04-27 06:49:54.519206] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (824524) > buf size (4096) 00:07:24.863 [2024-04-27 06:49:54.519408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:253283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.863 [2024-04-27 06:49:54.519436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.863 #59 NEW cov: 11781 ft: 13880 corp: 27/287b lim: 30 exec/s: 59 rss: 69Mb L: 6/17 MS: 3 EraseBytes-ChangeByte-InsertByte- 00:07:24.863 [2024-04-27 06:49:54.549307] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000132 00:07:24.863 [2024-04-27 06:49:54.549510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:25ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.863 [2024-04-27 06:49:54.549536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.863 #60 NEW cov: 11781 ft: 13943 corp: 28/297b lim: 30 exec/s: 60 rss: 69Mb L: 10/17 MS: 1 CrossOver- 00:07:24.863 [2024-04-27 06:49:54.589412] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.863 [2024-04-27 06:49:54.589613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.863 [2024-04-27 06:49:54.589644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.864 #61 NEW cov: 11781 ft: 13965 corp: 29/306b lim: 30 exec/s: 61 rss: 69Mb L: 9/17 MS: 1 CopyPart- 00:07:24.864 [2024-04-27 06:49:54.629540] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (54280) > buf size (4096) 00:07:24.864 [2024-04-27 06:49:54.629739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.629763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.864 #62 NEW cov: 11781 ft: 13993 corp: 30/315b lim: 30 exec/s: 62 rss: 69Mb L: 9/17 MS: 1 ChangeByte- 00:07:24.864 [2024-04-27 06:49:54.669885] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:24.864 [2024-04-27 06:49:54.669995] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log 
page offset 0x30000ff3f 00:07:24.864 [2024-04-27 06:49:54.670191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.670217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.864 [2024-04-27 06:49:54.670269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.670283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.864 [2024-04-27 06:49:54.670335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.670349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.864 [2024-04-27 06:49:54.670404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.670418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.864 #63 NEW cov: 11798 ft: 14572 corp: 31/340b lim: 30 exec/s: 63 rss: 70Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:24.864 [2024-04-27 06:49:54.709799] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (52224) > buf size (4096) 00:07:24.864 [2024-04-27 06:49:54.710001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff00dc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.710028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.864 #64 NEW cov: 11798 ft: 14583 corp: 32/351b lim: 30 exec/s: 64 rss: 70Mb L: 11/25 MS: 1 EraseBytes- 00:07:24.864 [2024-04-27 06:49:54.749928] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.864 [2024-04-27 06:49:54.750039] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:24.864 [2024-04-27 06:49:54.750248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.750274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.864 [2024-04-27 06:49:54.750326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.864 [2024-04-27 06:49:54.750340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.124 #65 NEW cov: 11798 ft: 14623 corp: 33/368b lim: 30 exec/s: 65 rss: 70Mb L: 17/25 MS: 1 CrossOver- 00:07:25.124 [2024-04-27 06:49:54.789984] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aff 00:07:25.124 [2024-04-27 06:49:54.790199] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.790224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.124 #66 NEW cov: 11798 ft: 14629 corp: 34/376b lim: 30 exec/s: 66 rss: 70Mb L: 8/25 MS: 1 CopyPart- 00:07:25.124 [2024-04-27 06:49:54.830118] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.124 [2024-04-27 06:49:54.830329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.830360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.124 #67 NEW cov: 11798 ft: 14633 corp: 35/386b lim: 30 exec/s: 67 rss: 70Mb L: 10/25 MS: 1 InsertByte- 00:07:25.124 [2024-04-27 06:49:54.860334] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.124 [2024-04-27 06:49:54.860560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.860601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.124 [2024-04-27 06:49:54.860657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000183ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.860672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.124 #68 NEW cov: 11798 ft: 14655 corp: 36/400b lim: 30 exec/s: 68 rss: 70Mb L: 14/25 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:07:25.124 [2024-04-27 06:49:54.900335] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.124 [2024-04-27 06:49:54.900448] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261600) > buf size (4096) 00:07:25.124 [2024-04-27 06:49:54.900657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ab283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.900683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.124 [2024-04-27 06:49:54.900736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff770011 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.900753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.124 #69 NEW cov: 11798 ft: 14663 corp: 37/415b lim: 30 exec/s: 69 rss: 70Mb L: 15/25 MS: 1 CMP- DE: "\377w\021\340.\244B\274"- 00:07:25.124 [2024-04-27 06:49:54.940462] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.124 [2024-04-27 06:49:54.940571] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:25.124 [2024-04-27 06:49:54.940764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.940790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.124 [2024-04-27 06:49:54.940844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.940859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.124 #70 NEW cov: 11798 ft: 14670 corp: 38/427b lim: 30 exec/s: 70 rss: 70Mb L: 12/25 MS: 1 ShuffleBytes- 00:07:25.124 [2024-04-27 06:49:54.980555] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aff 00:07:25.124 [2024-04-27 06:49:54.980757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a1283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.124 [2024-04-27 06:49:54.980781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.124 #71 NEW cov: 11798 ft: 14675 corp: 39/435b lim: 30 exec/s: 71 rss: 70Mb L: 8/25 MS: 1 ChangeBit- 00:07:25.384 [2024-04-27 06:49:55.020653] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aff 00:07:25.384 [2024-04-27 06:49:55.020879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fd1283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.384 [2024-04-27 06:49:55.020905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.384 #72 NEW cov: 11798 ft: 14679 corp: 40/443b lim: 30 exec/s: 72 rss: 70Mb L: 8/25 MS: 1 ChangeBinInt- 00:07:25.384 [2024-04-27 06:49:55.060772] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.384 [2024-04-27 06:49:55.060897] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.384 [2024-04-27 06:49:55.061086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.384 [2024-04-27 06:49:55.061112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.384 [2024-04-27 06:49:55.061168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.384 [2024-04-27 06:49:55.061183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.384 #73 NEW cov: 11798 ft: 14706 corp: 41/457b lim: 30 exec/s: 73 rss: 70Mb L: 14/25 MS: 1 CopyPart- 00:07:25.384 [2024-04-27 06:49:55.100965] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.384 [2024-04-27 06:49:55.101071] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.384 [2024-04-27 06:49:55.101187] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:25.384 [2024-04-27 06:49:55.101382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.384 [2024-04-27 06:49:55.101412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.384 [2024-04-27 06:49:55.101466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.384 [2024-04-27 06:49:55.101481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.384 [2024-04-27 06:49:55.101532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.384 [2024-04-27 06:49:55.101546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.384 #74 NEW cov: 11798 ft: 14926 corp: 42/476b lim: 30 exec/s: 74 rss: 70Mb L: 19/25 MS: 1 InsertRepeatedBytes- 00:07:25.384 [2024-04-27 06:49:55.140988] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:25.384 [2024-04-27 06:49:55.141194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2b2583ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.141218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.385 #75 NEW cov: 11798 ft: 14937 corp: 43/487b lim: 30 exec/s: 75 rss: 70Mb L: 11/25 MS: 1 InsertByte- 00:07:25.385 [2024-04-27 06:49:55.181173] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (52224) > buf size (4096) 00:07:25.385 [2024-04-27 06:49:55.181376] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.385 [2024-04-27 06:49:55.181584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff00dc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.181611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.385 [2024-04-27 06:49:55.181663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.181677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.385 [2024-04-27 06:49:55.181730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:dcff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.181744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.385 #76 NEW cov: 11798 ft: 14939 corp: 44/507b lim: 30 exec/s: 76 rss: 70Mb L: 20/25 MS: 1 InsertRepeatedBytes- 00:07:25.385 [2024-04-27 06:49:55.221178] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (48136) > buf size (4096) 00:07:25.385 [2024-04-27 06:49:55.221379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2f010000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.221408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.385 #77 NEW cov: 11798 ft: 14953 corp: 45/516b lim: 30 exec/s: 77 rss: 70Mb L: 9/25 MS: 1 ChangeBinInt- 00:07:25.385 [2024-04-27 06:49:55.261339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.385 [2024-04-27 06:49:55.261472] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.385 [2024-04-27 06:49:55.261684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.261711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.385 [2024-04-27 06:49:55.261766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.385 [2024-04-27 06:49:55.261783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.645 #78 NEW cov: 11798 ft: 14960 corp: 46/533b lim: 30 exec/s: 39 rss: 70Mb L: 17/25 MS: 1 ChangeBit- 00:07:25.645 #78 DONE cov: 11798 ft: 14960 corp: 46/533b lim: 30 exec/s: 39 rss: 70Mb 00:07:25.645 ###### Recommended dictionary. ###### 00:07:25.645 "\001\000" # Uses: 0 00:07:25.645 "\001\000\000\000\000\000\000\001" # Uses: 1 00:07:25.645 "\377w\021\340.\244B\274" # Uses: 0 00:07:25.645 ###### End of recommended dictionary. ###### 00:07:25.645 Done 78 runs in 2 second(s) 00:07:25.645 06:49:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:25.645 06:49:55 -- ../common.sh@72 -- # (( i++ )) 00:07:25.645 06:49:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:25.645 06:49:55 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:25.645 06:49:55 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:25.645 06:49:55 -- nvmf/run.sh@24 -- # local timen=1 00:07:25.645 06:49:55 -- nvmf/run.sh@25 -- # local core=0x1 00:07:25.645 06:49:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:25.645 06:49:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:25.645 06:49:55 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:25.645 06:49:55 -- nvmf/run.sh@29 -- # port=4402 00:07:25.645 06:49:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:25.645 06:49:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:25.645 06:49:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:25.645 06:49:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:25.645 [2024-04-27 06:49:55.442727] Starting SPDK v24.01.1-pre git sha1 
36faa8c31 / DPDK 23.11.0 initialization... 00:07:25.645 [2024-04-27 06:49:55.442806] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2619390 ] 00:07:25.645 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.904 [2024-04-27 06:49:55.620163] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.904 [2024-04-27 06:49:55.639566] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.904 [2024-04-27 06:49:55.639693] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.904 [2024-04-27 06:49:55.691317] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:25.904 [2024-04-27 06:49:55.707642] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:25.904 INFO: Running with entropic power schedule (0xFF, 100). 00:07:25.904 INFO: Seed: 2894468949 00:07:25.904 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:25.904 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:25.904 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:25.904 INFO: A corpus is not provided, starting from an empty corpus 00:07:25.904 #2 INITED exec/s: 0 rss: 59Mb 00:07:25.904 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:25.904 This may also happen if the target rejected all inputs we tried so far 00:07:25.904 [2024-04-27 06:49:55.752900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.904 [2024-04-27 06:49:55.752930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.904 [2024-04-27 06:49:55.752984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.904 [2024-04-27 06:49:55.753000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.163 NEW_FUNC[1/662]: 0x4a0900 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:26.163 NEW_FUNC[2/662]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:26.163 #8 NEW cov: 11480 ft: 11490 corp: 2/20b lim: 35 exec/s: 0 rss: 66Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:26.423 [2024-04-27 06:49:56.063760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.063794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 [2024-04-27 06:49:56.063849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d80098d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.063863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.423 NEW_FUNC[1/1]: 0x1260340 in nvmf_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/nvmf.c:153 00:07:26.423 #9 NEW cov: 11602 ft: 11908 corp: 3/39b lim: 35 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ChangeBit- 00:07:26.423 [2024-04-27 06:49:56.113701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a00d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.113728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 #13 NEW cov: 11608 ft: 12548 corp: 4/46b lim: 35 exec/s: 0 rss: 67Mb L: 7/19 MS: 4 CopyPart-ShuffleBytes-ShuffleBytes-CrossOver- 00:07:26.423 [2024-04-27 06:49:56.153942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.153968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 [2024-04-27 06:49:56.154026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.154040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.423 #14 NEW cov: 11693 ft: 12809 corp: 5/66b lim: 35 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertByte- 00:07:26.423 [2024-04-27 06:49:56.193953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a00d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.193980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 #15 NEW cov: 11693 ft: 12951 corp: 6/76b lim: 35 exec/s: 0 rss: 67Mb L: 10/20 MS: 1 InsertRepeatedBytes- 00:07:26.423 [2024-04-27 06:49:56.234068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800a1 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.234095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 #18 NEW cov: 11693 ft: 13098 corp: 7/87b lim: 35 exec/s: 0 rss: 67Mb L: 11/20 MS: 3 ShuffleBytes-ChangeByte-CrossOver- 00:07:26.423 [2024-04-27 06:49:56.274173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:c8d800d8 cdw11:0a00d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.274199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 #19 NEW cov: 11693 ft: 13168 corp: 8/94b lim: 35 exec/s: 0 rss: 67Mb L: 7/20 MS: 1 ChangeBit- 00:07:26.423 [2024-04-27 06:49:56.314422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.314448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.423 [2024-04-27 06:49:56.314525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.423 [2024-04-27 06:49:56.314539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.683 #20 NEW cov: 11693 ft: 13205 corp: 9/110b lim: 35 exec/s: 0 rss: 67Mb L: 16/20 MS: 1 EraseBytes- 00:07:26.683 [2024-04-27 06:49:56.344540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ad800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.683 [2024-04-27 06:49:56.344567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.683 [2024-04-27 06:49:56.344625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d80098d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.683 [2024-04-27 06:49:56.344640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.683 #21 NEW cov: 11693 ft: 13293 corp: 10/129b lim: 35 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 ShuffleBytes- 00:07:26.683 [2024-04-27 06:49:56.384785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a4c00fb cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.683 [2024-04-27 06:49:56.384811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.683 [2024-04-27 06:49:56.384869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4c4c004c cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.683 [2024-04-27 06:49:56.384883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.384940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4c4c004c cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.384954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.684 #28 NEW cov: 11693 ft: 13638 corp: 11/150b lim: 35 exec/s: 0 rss: 68Mb L: 21/21 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:26.684 [2024-04-27 06:49:56.425047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.425074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.425131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.425146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.425202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.425216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.425274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d8001cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.425288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.684 #29 NEW cov: 11693 ft: 14230 corp: 12/184b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:07:26.684 [2024-04-27 06:49:56.465264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a4c00fb cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.465290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.465346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4c4c004c cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.465360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.465419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4c4c004c cdw11:99009999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.465433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.465490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:99990099 cdw11:99009999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.465505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.684 [2024-04-27 06:49:56.465562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:99990099 cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.465575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.684 #30 NEW cov: 11693 ft: 14322 corp: 13/219b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:26.684 [2024-04-27 06:49:56.504842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a00d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.504868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.684 #31 NEW cov: 11693 ft: 14390 corp: 14/231b lim: 35 exec/s: 0 rss: 69Mb L: 12/35 MS: 1 CrossOver- 00:07:26.684 [2024-04-27 06:49:56.544975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a00d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.684 [2024-04-27 06:49:56.545001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.684 #32 NEW cov: 11693 ft: 14456 corp: 15/241b lim: 35 exec/s: 0 rss: 69Mb L: 10/35 MS: 1 CopyPart- 00:07:26.944 [2024-04-27 06:49:56.585101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:c83500d8 
cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.585127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 #33 NEW cov: 11693 ft: 14496 corp: 16/249b lim: 35 exec/s: 0 rss: 69Mb L: 8/35 MS: 1 InsertByte- 00:07:26.944 [2024-04-27 06:49:56.625491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:84b40084 cdw11:b400b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.625516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.625576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b4b400b4 cdw11:b400b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.625589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.625649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b4b400b4 cdw11:b400b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.625666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.944 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:26.944 #36 NEW cov: 11716 ft: 14520 corp: 17/273b lim: 35 exec/s: 0 rss: 69Mb L: 24/35 MS: 3 InsertByte-CopyPart-InsertRepeatedBytes- 00:07:26.944 [2024-04-27 06:49:56.665718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.665743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.665801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.665815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.665872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d0d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.665885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.665941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d8001cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.665954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.944 #37 NEW cov: 11716 ft: 14538 corp: 18/307b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:07:26.944 [2024-04-27 06:49:56.705600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.705625] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.705700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.705714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.944 #38 NEW cov: 11716 ft: 14540 corp: 19/321b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 1 EraseBytes- 00:07:26.944 [2024-04-27 06:49:56.746141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.746166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.746225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d80025d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.746239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.746293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d8d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.746308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.746363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.746376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.746435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.746452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.944 #39 NEW cov: 11716 ft: 14565 corp: 20/356b lim: 35 exec/s: 39 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:07:26.944 [2024-04-27 06:49:56.785958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a4c00fb cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.785983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.786043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4c4c004c cdw11:4c004c46 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.786057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.786114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4c4c004c cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.786127] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.944 #40 NEW cov: 11716 ft: 14581 corp: 21/377b lim: 35 exec/s: 40 rss: 69Mb L: 21/35 MS: 1 ChangeBinInt- 00:07:26.944 [2024-04-27 06:49:56.826331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.826356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.826418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d80025d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.826432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.826489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d80000d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.826503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.826559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.826573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.944 [2024-04-27 06:49:56.826627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.944 [2024-04-27 06:49:56.826640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.204 #41 NEW cov: 11716 ft: 14615 corp: 22/412b lim: 35 exec/s: 41 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:27.204 [2024-04-27 06:49:56.866157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a4c00fb cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.204 [2024-04-27 06:49:56.866182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.204 [2024-04-27 06:49:56.866241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9999004c cdw11:99009999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.204 [2024-04-27 06:49:56.866255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.204 [2024-04-27 06:49:56.866327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:99990099 cdw11:4c00994c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.204 [2024-04-27 06:49:56.866344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.204 #42 NEW cov: 11716 ft: 14622 corp: 23/434b lim: 35 exec/s: 42 rss: 70Mb L: 22/35 MS: 1 EraseBytes- 00:07:27.204 [2024-04-27 06:49:56.906438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a 
cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.204 [2024-04-27 06:49:56.906463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.204 [2024-04-27 06:49:56.906523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.204 [2024-04-27 06:49:56.906537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:56.906596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d0d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.906609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:56.906668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d8001cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.906681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.205 #43 NEW cov: 11716 ft: 14644 corp: 24/468b lim: 35 exec/s: 43 rss: 70Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:27.205 [2024-04-27 06:49:56.946535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:c8f800d8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.946560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:56.946616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f8f800f8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.946630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:56.946689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:f8f800f8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.946702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:56.946759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:f8f800f8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.946772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.205 #44 NEW cov: 11716 ft: 14657 corp: 25/501b lim: 35 exec/s: 44 rss: 70Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:07:27.205 [2024-04-27 06:49:56.986236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0000d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:56.986261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.205 #45 NEW cov: 11716 ft: 14672 corp: 26/512b lim: 35 exec/s: 45 rss: 70Mb L: 11/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:27.205 [2024-04-27 
06:49:57.026773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:c8f800d8 cdw11:f800f882 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.026798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.026862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f8f800f8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.026875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.026932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:f8f800f8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.026946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.027003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:f8f800f8 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.027017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.205 #46 NEW cov: 11716 ft: 14718 corp: 27/545b lim: 35 exec/s: 46 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:07:27.205 [2024-04-27 06:49:57.067055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.067081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.067140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d89800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.067154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.067209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d8d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.067223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.067278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.067292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.205 [2024-04-27 06:49:57.067349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.205 [2024-04-27 06:49:57.067363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.205 #47 NEW cov: 11716 ft: 14725 corp: 28/580b lim: 35 exec/s: 47 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:27.465 [2024-04-27 06:49:57.107024] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff006b cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.107050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.107110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.107124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.107181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.107195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.107253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.107269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.465 #49 NEW cov: 11716 ft: 14733 corp: 29/610b lim: 35 exec/s: 49 rss: 70Mb L: 30/35 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:27.465 [2024-04-27 06:49:57.146695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a000cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.146720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 #50 NEW cov: 11716 ft: 14741 corp: 30/622b lim: 35 exec/s: 50 rss: 70Mb L: 12/35 MS: 1 ChangeBinInt- 00:07:27.465 [2024-04-27 06:49:57.186859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0000d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.186885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 #51 NEW cov: 11716 ft: 14763 corp: 31/633b lim: 35 exec/s: 51 rss: 70Mb L: 11/35 MS: 1 ChangeBit- 00:07:27.465 [2024-04-27 06:49:57.226921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:35c800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.226947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 #52 NEW cov: 11716 ft: 14827 corp: 32/641b lim: 35 exec/s: 52 rss: 70Mb L: 8/35 MS: 1 ShuffleBytes- 00:07:27.465 [2024-04-27 06:49:57.267189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ad800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.267214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.267275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d8009ad8 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.267289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.465 #53 NEW cov: 11716 ft: 14844 corp: 33/660b lim: 35 exec/s: 53 rss: 70Mb L: 19/35 MS: 1 ChangeBit- 00:07:27.465 [2024-04-27 06:49:57.307614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.307638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.307694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.307708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.307763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d0d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.307776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.307829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d8001cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.307843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.465 #54 NEW cov: 11716 ft: 14894 corp: 34/694b lim: 35 exec/s: 54 rss: 70Mb L: 34/35 MS: 1 ChangeBinInt- 00:07:27.465 [2024-04-27 06:49:57.347874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.347902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.347959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d8003fd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.347973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.348028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d8d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.348042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.348099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.465 [2024-04-27 06:49:57.348112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.465 [2024-04-27 06:49:57.348169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.466 
[2024-04-27 06:49:57.348182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.726 #55 NEW cov: 11716 ft: 14938 corp: 35/729b lim: 35 exec/s: 55 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:07:27.726 [2024-04-27 06:49:57.387774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.387797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.387855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d80025d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.387870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.387927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d8d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.387941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.387997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.388010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.388066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d8d800d8 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.388080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.726 #56 NEW cov: 11716 ft: 14970 corp: 36/764b lim: 35 exec/s: 56 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:27.726 [2024-04-27 06:49:57.427940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff006b cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.427965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.428039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.428053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.428110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:03ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.428125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.428180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.428193] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.726 #57 NEW cov: 11716 ft: 15012 corp: 37/795b lim: 35 exec/s: 57 rss: 70Mb L: 31/35 MS: 1 InsertByte- 00:07:27.726 [2024-04-27 06:49:57.467621] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:27.726 [2024-04-27 06:49:57.467854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0000d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.467881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.467940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.467957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.726 #58 NEW cov: 11725 ft: 15037 corp: 38/810b lim: 35 exec/s: 58 rss: 70Mb L: 15/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:27.726 [2024-04-27 06:49:57.507958] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:27.726 [2024-04-27 06:49:57.508201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.508226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.508286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.508300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.508358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d80000d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.508372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.508426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:d8001cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.508442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.726 #59 NEW cov: 11725 ft: 15048 corp: 39/844b lim: 35 exec/s: 59 rss: 70Mb L: 34/35 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:27.726 [2024-04-27 06:49:57.548102] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:27.726 [2024-04-27 06:49:57.548331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.548356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.548415] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.548430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.548491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d80000d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.548505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.726 [2024-04-27 06:49:57.548563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00001c01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.548578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.726 #60 NEW cov: 11725 ft: 15052 corp: 40/878b lim: 35 exec/s: 60 rss: 70Mb L: 34/35 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\002"- 00:07:27.726 [2024-04-27 06:49:57.588018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a000cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.726 [2024-04-27 06:49:57.588044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.726 #61 NEW cov: 11725 ft: 15072 corp: 41/890b lim: 35 exec/s: 61 rss: 70Mb L: 12/35 MS: 1 ChangeByte- 00:07:27.986 [2024-04-27 06:49:57.628445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a4c00fb cdw11:4c004c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.628472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.986 [2024-04-27 06:49:57.628546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4c4c004c cdw11:99009999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.628561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.986 [2024-04-27 06:49:57.628617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:99990099 cdw11:99009999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.628631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.986 #62 NEW cov: 11725 ft: 15145 corp: 42/914b lim: 35 exec/s: 62 rss: 70Mb L: 24/35 MS: 1 CopyPart- 00:07:27.986 [2024-04-27 06:49:57.668384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:0a00d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.668415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.986 [2024-04-27 06:49:57.668474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8c800c8 cdw11:b500f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.668488] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.986 #63 NEW cov: 11725 ft: 15160 corp: 43/931b lim: 35 exec/s: 63 rss: 70Mb L: 17/35 MS: 1 CrossOver- 00:07:27.986 [2024-04-27 06:49:57.708377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d800d8 cdw11:83000cd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.708408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.986 #64 NEW cov: 11725 ft: 15174 corp: 44/943b lim: 35 exec/s: 64 rss: 70Mb L: 12/35 MS: 1 ChangeByte- 00:07:27.986 [2024-04-27 06:49:57.749050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d8d8000a cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.749076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.986 [2024-04-27 06:49:57.749133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d8d800d8 cdw11:d800d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.749150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.986 [2024-04-27 06:49:57.749203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d8d800d8 cdw11:d800d0d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.986 [2024-04-27 06:49:57.749216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.987 [2024-04-27 06:49:57.749271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d8d80025 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.987 [2024-04-27 06:49:57.749284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.987 [2024-04-27 06:49:57.749339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d8d800d5 cdw11:d800d81c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.987 [2024-04-27 06:49:57.749353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.987 #65 NEW cov: 11725 ft: 15188 corp: 45/978b lim: 35 exec/s: 32 rss: 70Mb L: 35/35 MS: 1 InsertByte- 00:07:27.987 #65 DONE cov: 11725 ft: 15188 corp: 45/978b lim: 35 exec/s: 32 rss: 70Mb 00:07:27.987 ###### Recommended dictionary. ###### 00:07:27.987 "\000\000\000\000" # Uses: 2 00:07:27.987 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:27.987 "\001\000\000\000\000\000\000\002" # Uses: 0 00:07:27.987 ###### End of recommended dictionary. 
###### 00:07:27.987 Done 65 runs in 2 second(s) 00:07:28.246 06:49:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:28.246 06:49:57 -- ../common.sh@72 -- # (( i++ )) 00:07:28.246 06:49:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.246 06:49:57 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:28.246 06:49:57 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:28.246 06:49:57 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.246 06:49:57 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.246 06:49:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:28.246 06:49:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:28.246 06:49:57 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:28.246 06:49:57 -- nvmf/run.sh@29 -- # port=4403 00:07:28.246 06:49:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:28.246 06:49:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:28.246 06:49:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.247 06:49:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:28.247 [2024-04-27 06:49:57.932350] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:28.247 [2024-04-27 06:49:57.932455] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2619822 ] 00:07:28.247 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.247 [2024-04-27 06:49:58.109179] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.247 [2024-04-27 06:49:58.128660] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.247 [2024-04-27 06:49:58.128789] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.506 [2024-04-27 06:49:58.180304] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:28.506 [2024-04-27 06:49:58.196631] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:28.506 INFO: Running with entropic power schedule (0xFF, 100). 00:07:28.506 INFO: Seed: 1088499566 00:07:28.506 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:28.506 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:28.506 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:28.506 INFO: A corpus is not provided, starting from an empty corpus 00:07:28.506 #2 INITED exec/s: 0 rss: 59Mb 00:07:28.506 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:28.506 This may also happen if the target rejected all inputs we tried so far 00:07:28.506 [2024-04-27 06:49:58.242171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.506 [2024-04-27 06:49:58.242201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.766 NEW_FUNC[1/672]: 0x4a25d0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:28.766 NEW_FUNC[2/672]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:28.766 #5 NEW cov: 11723 ft: 11709 corp: 2/17b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 3 ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:28.766 [2024-04-27 06:49:58.563117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.766 [2024-04-27 06:49:58.563149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.766 #6 NEW cov: 11851 ft: 12194 corp: 3/36b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:28.766 [2024-04-27 06:49:58.613152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.766 [2024-04-27 06:49:58.613182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.766 #7 NEW cov: 11857 ft: 12409 corp: 4/55b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:28.766 [2024-04-27 06:49:58.653318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.766 [2024-04-27 06:49:58.653346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.026 #13 NEW cov: 11942 ft: 12796 corp: 5/71b lim: 20 exec/s: 0 rss: 67Mb L: 16/19 MS: 1 ShuffleBytes- 00:07:29.026 #14 NEW cov: 11947 ft: 13383 corp: 6/81b lim: 20 exec/s: 0 rss: 67Mb L: 10/19 MS: 1 CrossOver- 00:07:29.026 [2024-04-27 06:49:58.733692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.026 [2024-04-27 06:49:58.733720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.026 #15 NEW cov: 11947 ft: 13536 corp: 7/101b lim: 20 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertByte- 00:07:29.026 [2024-04-27 06:49:58.773618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.026 [2024-04-27 06:49:58.773645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.026 #16 NEW cov: 11947 ft: 13618 corp: 8/118b lim: 20 exec/s: 0 rss: 67Mb L: 17/20 MS: 1 EraseBytes- 00:07:29.026 #20 NEW cov: 11947 ft: 13989 corp: 9/124b lim: 20 exec/s: 0 rss: 68Mb L: 6/20 MS: 4 InsertByte-ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:07:29.026 [2024-04-27 06:49:58.853815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.026 [2024-04-27 06:49:58.853843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.026 #21 NEW cov: 11947 ft: 14014 corp: 10/143b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 InsertRepeatedBytes- 00:07:29.026 [2024-04-27 06:49:58.894003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.026 [2024-04-27 06:49:58.894030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.026 #27 NEW cov: 11947 ft: 14144 corp: 11/162b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:29.286 [2024-04-27 06:49:58.934037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.286 [2024-04-27 06:49:58.934063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.286 #28 NEW cov: 11947 ft: 14154 corp: 12/178b lim: 20 exec/s: 0 rss: 68Mb L: 16/20 MS: 1 ChangeByte- 00:07:29.286 [2024-04-27 06:49:58.974176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.286 [2024-04-27 06:49:58.974202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.286 #29 NEW cov: 11947 ft: 14181 corp: 13/197b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:29.286 #35 NEW cov: 11947 ft: 14213 corp: 14/215b lim: 20 exec/s: 0 rss: 68Mb L: 18/20 MS: 1 InsertRepeatedBytes- 00:07:29.286 [2024-04-27 06:49:59.054450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.286 [2024-04-27 06:49:59.054476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.286 #36 NEW cov: 11947 ft: 14232 corp: 15/234b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 ChangeBit- 00:07:29.286 #37 NEW cov: 11947 ft: 14254 corp: 16/244b lim: 20 exec/s: 0 rss: 68Mb L: 10/20 MS: 1 ChangeBit- 00:07:29.286 [2024-04-27 06:49:59.134823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.286 [2024-04-27 06:49:59.134849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.286 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:29.286 #38 NEW cov: 11970 ft: 14323 corp: 17/264b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:29.546 #39 NEW cov: 11970 ft: 14364 corp: 18/270b lim: 20 exec/s: 0 rss: 68Mb L: 6/20 MS: 1 EraseBytes- 00:07:29.546 #44 NEW cov: 11970 ft: 14424 corp: 19/276b lim: 20 exec/s: 44 rss: 69Mb L: 6/20 MS: 5 ShuffleBytes-CopyPart-InsertByte-CrossOver-InsertRepeatedBytes- 00:07:29.546 [2024-04-27 06:49:59.265271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.546 [2024-04-27 06:49:59.265298] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.546 #45 NEW cov: 11970 ft: 14476 corp: 20/296b lim: 20 exec/s: 45 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:29.546 [2024-04-27 06:49:59.305026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.546 [2024-04-27 06:49:59.305053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.546 #46 NEW cov: 11974 ft: 14598 corp: 21/309b lim: 20 exec/s: 46 rss: 69Mb L: 13/20 MS: 1 EraseBytes- 00:07:29.546 [2024-04-27 06:49:59.345197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.546 [2024-04-27 06:49:59.345224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.546 #47 NEW cov: 11974 ft: 14653 corp: 22/328b lim: 20 exec/s: 47 rss: 69Mb L: 19/20 MS: 1 CopyPart- 00:07:29.546 #48 NEW cov: 11974 ft: 14666 corp: 23/335b lim: 20 exec/s: 48 rss: 69Mb L: 7/20 MS: 1 InsertByte- 00:07:29.546 [2024-04-27 06:49:59.425386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.546 [2024-04-27 06:49:59.425416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.806 #49 NEW cov: 11974 ft: 14707 corp: 24/352b lim: 20 exec/s: 49 rss: 69Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:29.806 #50 NEW cov: 11974 ft: 14713 corp: 25/362b lim: 20 exec/s: 50 rss: 69Mb L: 10/20 MS: 1 ChangeBit- 00:07:29.806 #51 NEW cov: 11974 ft: 14722 corp: 26/369b lim: 20 exec/s: 51 rss: 69Mb L: 7/20 MS: 1 ChangeBinInt- 00:07:29.806 [2024-04-27 06:49:59.545540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.806 [2024-04-27 06:49:59.545568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.806 #52 NEW cov: 11974 ft: 14730 corp: 27/379b lim: 20 exec/s: 52 rss: 69Mb L: 10/20 MS: 1 CrossOver- 00:07:29.806 [2024-04-27 06:49:59.585907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.806 [2024-04-27 06:49:59.585933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.806 #53 NEW cov: 11974 ft: 14733 corp: 28/397b lim: 20 exec/s: 53 rss: 69Mb L: 18/20 MS: 1 InsertRepeatedBytes- 00:07:29.806 [2024-04-27 06:49:59.625722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.806 [2024-04-27 06:49:59.625748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.806 #54 NEW cov: 11974 ft: 14736 corp: 29/407b lim: 20 exec/s: 54 rss: 70Mb L: 10/20 MS: 1 CrossOver- 00:07:29.806 [2024-04-27 06:49:59.666085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.806 [2024-04-27 06:49:59.666112] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.806 #55 NEW cov: 11974 ft: 14750 corp: 30/426b lim: 20 exec/s: 55 rss: 70Mb L: 19/20 MS: 1 CrossOver- 00:07:30.066 #56 NEW cov: 11974 ft: 14761 corp: 31/432b lim: 20 exec/s: 56 rss: 70Mb L: 6/20 MS: 1 ChangeBit- 00:07:30.066 [2024-04-27 06:49:59.746610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.066 [2024-04-27 06:49:59.746637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.066 #57 NEW cov: 11974 ft: 14767 corp: 32/452b lim: 20 exec/s: 57 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:30.066 #58 NEW cov: 11974 ft: 14825 corp: 33/469b lim: 20 exec/s: 58 rss: 70Mb L: 17/20 MS: 1 ChangeBit- 00:07:30.066 [2024-04-27 06:49:59.826667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.066 [2024-04-27 06:49:59.826694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.066 #59 NEW cov: 11974 ft: 14911 corp: 34/488b lim: 20 exec/s: 59 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:30.066 #65 NEW cov: 11974 ft: 15005 corp: 35/507b lim: 20 exec/s: 65 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:30.066 [2024-04-27 06:49:59.907012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.066 [2024-04-27 06:49:59.907039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.066 #66 NEW cov: 11974 ft: 15013 corp: 36/527b lim: 20 exec/s: 66 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:30.325 #67 NEW cov: 11974 ft: 15083 corp: 37/537b lim: 20 exec/s: 67 rss: 70Mb L: 10/20 MS: 1 ChangeByte- 00:07:30.325 [2024-04-27 06:49:59.987102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.325 [2024-04-27 06:49:59.987131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.325 #68 NEW cov: 11974 ft: 15107 corp: 38/554b lim: 20 exec/s: 68 rss: 70Mb L: 17/20 MS: 1 InsertRepeatedBytes- 00:07:30.325 #69 NEW cov: 11974 ft: 15127 corp: 39/560b lim: 20 exec/s: 69 rss: 70Mb L: 6/20 MS: 1 ChangeBit- 00:07:30.325 [2024-04-27 06:50:00.067554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.325 [2024-04-27 06:50:00.067581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.325 #70 NEW cov: 11974 ft: 15146 corp: 40/580b lim: 20 exec/s: 70 rss: 70Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:30.325 [2024-04-27 06:50:00.107325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.325 [2024-04-27 06:50:00.107352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.325 #71 NEW cov: 11974 ft: 15156 corp: 41/593b lim: 20 exec/s: 71 rss: 70Mb L: 13/20 MS: 1 ShuffleBytes- 00:07:30.325 
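Annotation (not part of the captured output): the paired NOTICE lines throughout these runs are the harness echoing each fuzzed admin command (nvme_admin_qpair_print_command) and the target's completion (spdk_nvme_print_completion); the mutated bytes land in the command's cdw10/cdw11 dwords, and statuses such as INVALID FIELD (00/02) or ABORTED - BY REQUEST (00/07) show the controller rejecting the input rather than crashing. The interleaved status lines are standard libFuzzer output: cov counts covered code edges, ft counts features, corp gives corpus units and total bytes, exec/s is throughput, rss is memory, NEW_FUNC names a function reached for the first time, and MS lists the mutation sequence that produced the input; PersAutoDict and CMP entries carrying a DE: payload come from the value-profile dictionary, which is also what the "Recommended dictionary" summaries collect. A hedged sketch of feeding those run-2 entries back into a later local session, assuming libFuzzer's standard -dict= option and its usual \xNN escape syntax (the run prints entries in octal form, re-encoded in hex here; the file path is illustrative):

  # run-2 recommended entries, hex-escaped for the dictionary parser
  printf '%s\n' '"\x00\x00\x00\x00"' \
      '"\x00\x00\x00\x00\x00\x00\x00\x00"' \
      '"\x01\x00\x00\x00\x00\x00\x00\x02"' > /tmp/llvm_nvmf.dict
  # then append -dict=/tmp/llvm_nvmf.dict to the llvm_nvme_fuzz
  # invocation shown in the run.sh trace above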
[2024-04-27 06:50:00.147623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.325 [2024-04-27 06:50:00.147648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.325 #72 NEW cov: 11974 ft: 15164 corp: 42/612b lim: 20 exec/s: 72 rss: 70Mb L: 19/20 MS: 1 ShuffleBytes- 00:07:30.325 [2024-04-27 06:50:00.187732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.325 [2024-04-27 06:50:00.187757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.325 #73 NEW cov: 11974 ft: 15225 corp: 43/629b lim: 20 exec/s: 73 rss: 70Mb L: 17/20 MS: 1 CMP- DE: "\000\000\000\000\000\000\003\255"- 00:07:30.584 [2024-04-27 06:50:00.227869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.584 [2024-04-27 06:50:00.227897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.584 #74 NEW cov: 11974 ft: 15240 corp: 44/648b lim: 20 exec/s: 37 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:30.584 #74 DONE cov: 11974 ft: 15240 corp: 44/648b lim: 20 exec/s: 37 rss: 70Mb 00:07:30.584 ###### Recommended dictionary. ###### 00:07:30.584 "\000\000\000\000\000\000\003\255" # Uses: 0 00:07:30.584 ###### End of recommended dictionary. ###### 00:07:30.584 Done 74 runs in 2 second(s) 00:07:30.584 06:50:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:30.584 06:50:00 -- ../common.sh@72 -- # (( i++ )) 00:07:30.584 06:50:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:30.584 06:50:00 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:30.584 06:50:00 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:30.584 06:50:00 -- nvmf/run.sh@24 -- # local timen=1 00:07:30.584 06:50:00 -- nvmf/run.sh@25 -- # local core=0x1 00:07:30.584 06:50:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:30.584 06:50:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:30.584 06:50:00 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:30.584 06:50:00 -- nvmf/run.sh@29 -- # port=4404 00:07:30.584 06:50:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:30.584 06:50:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:30.584 06:50:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:30.584 06:50:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:30.584 [2024-04-27 06:50:00.402685] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
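The nvmf/run.sh xtrace just above (before the SPDK startup banner) shows how each fuzzer round is prepared: the round number is zero-padded, a per-round NVMe/TCP port (44NN) is derived from it, and the template config's default trsvcid of 4420 is rewritten before llvm_nvme_fuzz is launched against that port. A minimal bash sketch of that derivation, reusing the names visible in the trace; note that xtrace does not display redirections, so writing sed's output into /tmp/fuzz_json_4.conf is an assumption here:

    # Per-round setup as traced above for round 4; long paths shortened for clarity.
    fuzzer_type=4
    nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"

    # Round NN listens on port 44NN: printf %02d zero-pads the round number.
    port="44$(printf %02d "$fuzzer_type")"      # -> 4404

    # Swap the template's default trsvcid (4420) for this round's port.
    # (Assumed redirection: xtrace omits it, but the fuzzer being launched with
    # -c /tmp/fuzz_json_4.conf implies the rewritten config lands there.)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"

    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"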
00:07:30.584 [2024-04-27 06:50:00.402759] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2620223 ] 00:07:30.584 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.842 [2024-04-27 06:50:00.583465] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.842 [2024-04-27 06:50:00.602955] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:30.842 [2024-04-27 06:50:00.603083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.842 [2024-04-27 06:50:00.654850] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.842 [2024-04-27 06:50:00.671175] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:30.842 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.842 INFO: Seed: 3563511062 00:07:30.842 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:30.842 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:30.842 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:30.842 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.842 #2 INITED exec/s: 0 rss: 60Mb 00:07:30.842 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:30.842 This may also happen if the target rejected all inputs we tried so far 00:07:30.842 [2024-04-27 06:50:00.726654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.842 [2024-04-27 06:50:00.726684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.842 [2024-04-27 06:50:00.726737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.842 [2024-04-27 06:50:00.726750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.842 [2024-04-27 06:50:00.726802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.842 [2024-04-27 06:50:00.726816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.359 NEW_FUNC[1/664]: 0x4a36c0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:31.359 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:31.359 #11 NEW cov: 11510 ft: 11510 corp: 2/22b lim: 35 exec/s: 0 rss: 69Mb L: 21/21 MS: 4 ShuffleBytes-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:31.359 [2024-04-27 06:50:01.047631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.047679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.359 [2024-04-27 06:50:01.047748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff73ffff cdw11:e15f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.047770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.359 [2024-04-27 06:50:01.047836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7800e911 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.047872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.359 #12 NEW cov: 11623 ft: 11968 corp: 3/43b lim: 35 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 CMP- DE: "s\341_d\351\021x\000"- 00:07:31.359 [2024-04-27 06:50:01.097179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e15f0a73 cdw11:64e90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.097207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.359 #13 NEW cov: 11629 ft: 13011 corp: 4/52b lim: 35 exec/s: 0 rss: 69Mb L: 9/21 MS: 1 PersAutoDict- DE: "s\341_d\351\021x\000"- 00:07:31.359 [2024-04-27 06:50:01.137308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:771100ff cdw11:e3ce0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.137334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.359 #16 NEW cov: 11714 ft: 13364 corp: 5/61b lim: 35 exec/s: 0 rss: 69Mb L: 9/21 MS: 3 ChangeByte-ChangeByte-CMP- DE: "\377w\021\343\316?\011z"- 00:07:31.359 [2024-04-27 06:50:01.177468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a43 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.177494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.359 #17 NEW cov: 11714 ft: 13445 corp: 6/73b lim: 35 exec/s: 0 rss: 69Mb L: 12/21 MS: 1 CrossOver- 00:07:31.359 [2024-04-27 06:50:01.217553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:73e1780a cdw11:5f640003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.217579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.359 #20 NEW cov: 11714 ft: 13482 corp: 7/83b lim: 35 exec/s: 0 rss: 69Mb L: 10/21 MS: 3 CrossOver-ShuffleBytes-CrossOver- 00:07:31.359 [2024-04-27 06:50:01.248083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.248108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.359 [2024-04-27 06:50:01.248174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 
06:50:01.248187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.359 [2024-04-27 06:50:01.248237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.359 [2024-04-27 06:50:01.248251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.359 [2024-04-27 06:50:01.248298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.360 [2024-04-27 06:50:01.248311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.619 #21 NEW cov: 11714 ft: 13925 corp: 8/115b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:31.619 [2024-04-27 06:50:01.288048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.619 [2024-04-27 06:50:01.288073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.619 [2024-04-27 06:50:01.288130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:32ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.619 [2024-04-27 06:50:01.288143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.619 [2024-04-27 06:50:01.288195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.619 [2024-04-27 06:50:01.288208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.619 #22 NEW cov: 11714 ft: 13971 corp: 9/137b lim: 35 exec/s: 0 rss: 69Mb L: 22/32 MS: 1 InsertByte- 00:07:31.619 [2024-04-27 06:50:01.328341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.619 [2024-04-27 06:50:01.328366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.619 [2024-04-27 06:50:01.328420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:32ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.619 [2024-04-27 06:50:01.328433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.328482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:32ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.328512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.328564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 
[2024-04-27 06:50:01.328577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.620 #23 NEW cov: 11714 ft: 13988 corp: 10/170b lim: 35 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 CopyPart- 00:07:31.620 [2024-04-27 06:50:01.368266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.368291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.368341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.368354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.368409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.368422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.620 #24 NEW cov: 11714 ft: 14028 corp: 11/191b lim: 35 exec/s: 0 rss: 70Mb L: 21/33 MS: 1 ChangeByte- 00:07:31.620 [2024-04-27 06:50:01.398525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.398550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.398600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:32ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.398613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.398664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:34ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.398678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.398728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.398740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.620 #25 NEW cov: 11714 ft: 14105 corp: 12/224b lim: 35 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 ChangeASCIIInt- 00:07:31.620 [2024-04-27 06:50:01.438516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c6ff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.438541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.438593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.438607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.438657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.438671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.620 #26 NEW cov: 11714 ft: 14111 corp: 13/245b lim: 35 exec/s: 0 rss: 70Mb L: 21/33 MS: 1 ChangeByte- 00:07:31.620 [2024-04-27 06:50:01.468606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.468630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.468683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:30ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.468696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.620 [2024-04-27 06:50:01.468748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.468760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.620 #27 NEW cov: 11714 ft: 14192 corp: 14/267b lim: 35 exec/s: 0 rss: 70Mb L: 22/33 MS: 1 ChangeASCIIInt- 00:07:31.620 [2024-04-27 06:50:01.508402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:771100ff cdw11:e3ce0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.620 [2024-04-27 06:50:01.508426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 #28 NEW cov: 11714 ft: 14213 corp: 15/276b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 1 ChangeBinInt- 00:07:31.880 [2024-04-27 06:50:01.548495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e15f0a73 cdw11:f0e90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.548520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 #29 NEW cov: 11714 ft: 14303 corp: 16/285b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 1 ChangeByte- 00:07:31.880 [2024-04-27 06:50:01.589030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.589058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 [2024-04-27 06:50:01.589109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.589123] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.880 [2024-04-27 06:50:01.589173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:73e10002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.589201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.880 [2024-04-27 06:50:01.589253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:117864e9 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.589266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.880 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:31.880 #30 NEW cov: 11737 ft: 14325 corp: 17/314b lim: 35 exec/s: 0 rss: 70Mb L: 29/33 MS: 1 PersAutoDict- DE: "s\341_d\351\021x\000"- 00:07:31.880 [2024-04-27 06:50:01.628703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c1100ff cdw11:e3ce0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.628728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 #31 NEW cov: 11737 ft: 14341 corp: 18/323b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 1 ChangeBinInt- 00:07:31.880 [2024-04-27 06:50:01.668987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c1100ff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.669013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 [2024-04-27 06:50:01.669080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.669094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.880 #32 NEW cov: 11737 ft: 14631 corp: 19/341b lim: 35 exec/s: 0 rss: 70Mb L: 18/33 MS: 1 CrossOver- 00:07:31.880 [2024-04-27 06:50:01.708950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:771100ff cdw11:e37e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.708976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 #33 NEW cov: 11737 ft: 14650 corp: 20/351b lim: 35 exec/s: 33 rss: 70Mb L: 10/33 MS: 1 InsertByte- 00:07:31.880 [2024-04-27 06:50:01.749370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.749398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.880 [2024-04-27 06:50:01.749468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff73ffff cdw11:e15f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 
06:50:01.749482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.880 [2024-04-27 06:50:01.749533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffe9ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.880 [2024-04-27 06:50:01.749546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.880 #34 NEW cov: 11737 ft: 14679 corp: 21/372b lim: 35 exec/s: 34 rss: 70Mb L: 21/33 MS: 1 CopyPart- 00:07:32.140 [2024-04-27 06:50:01.789166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f6473e1 cdw11:e9110002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.140 [2024-04-27 06:50:01.789191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 #36 NEW cov: 11737 ft: 14688 corp: 22/381b lim: 35 exec/s: 36 rss: 70Mb L: 9/33 MS: 2 ChangeByte-PersAutoDict- DE: "s\341_d\351\021x\000"- 00:07:32.141 [2024-04-27 06:50:01.819382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7d1100ff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.819411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.819466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.819479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.141 #37 NEW cov: 11737 ft: 14727 corp: 23/399b lim: 35 exec/s: 37 rss: 70Mb L: 18/33 MS: 1 ChangeBit- 00:07:32.141 [2024-04-27 06:50:01.859527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:777300ff cdw11:e15f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.859552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.859605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7800e911 cdw11:11e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.859618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.141 #38 NEW cov: 11737 ft: 14731 corp: 24/416b lim: 35 exec/s: 38 rss: 70Mb L: 17/33 MS: 1 PersAutoDict- DE: "s\341_d\351\021x\000"- 00:07:32.141 [2024-04-27 06:50:01.899653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff77009c cdw11:73e10002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.899678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.899729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:117864e9 cdw11:00110003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.899742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.141 #39 NEW cov: 11737 ft: 14763 corp: 25/434b lim: 35 exec/s: 39 rss: 70Mb L: 18/33 MS: 1 InsertByte- 00:07:32.141 [2024-04-27 06:50:01.939763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:777300ff cdw11:e15f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.939787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.939832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7800e911 cdw11:11e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.939846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.141 #40 NEW cov: 11737 ft: 14777 corp: 26/452b lim: 35 exec/s: 40 rss: 70Mb L: 18/33 MS: 1 InsertByte- 00:07:32.141 [2024-04-27 06:50:01.980172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c1100ff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.980196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.980252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.980266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.980314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.980326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:01.980377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffe3ffff cdw11:ce3f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:01.980390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.141 #41 NEW cov: 11737 ft: 14790 corp: 27/481b lim: 35 exec/s: 41 rss: 70Mb L: 29/33 MS: 1 InsertRepeatedBytes- 00:07:32.141 [2024-04-27 06:50:02.020117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:02.020141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:02.020194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:02.020207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.141 [2024-04-27 06:50:02.020253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:32.141 [2024-04-27 06:50:02.020266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.402 #42 NEW cov: 11737 ft: 14807 corp: 28/508b lim: 35 exec/s: 42 rss: 70Mb L: 27/33 MS: 1 InsertRepeatedBytes- 00:07:32.402 [2024-04-27 06:50:02.050382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c1100ff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.402 [2024-04-27 06:50:02.050412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.402 [2024-04-27 06:50:02.050467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.402 [2024-04-27 06:50:02.050481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.402 [2024-04-27 06:50:02.050532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.402 [2024-04-27 06:50:02.050546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.402 [2024-04-27 06:50:02.050596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ceffffff cdw11:3fe30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.402 [2024-04-27 06:50:02.050609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.402 #43 NEW cov: 11737 ft: 14813 corp: 29/537b lim: 35 exec/s: 43 rss: 70Mb L: 29/33 MS: 1 ShuffleBytes- 00:07:32.402 [2024-04-27 06:50:02.090328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.402 [2024-04-27 06:50:02.090352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.090429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.090444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.090496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.090511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.403 #44 NEW cov: 11737 ft: 14822 corp: 30/558b lim: 35 exec/s: 44 rss: 70Mb L: 21/33 MS: 1 ChangeBit- 00:07:32.403 [2024-04-27 06:50:02.130447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.130473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.130527] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffdfffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.130540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.130592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.130621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.403 #45 NEW cov: 11737 ft: 14887 corp: 31/579b lim: 35 exec/s: 45 rss: 70Mb L: 21/33 MS: 1 ChangeBit- 00:07:32.403 [2024-04-27 06:50:02.160526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.160550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.160603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:40ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.160616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.160668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.160681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.403 #46 NEW cov: 11737 ft: 14925 corp: 32/606b lim: 35 exec/s: 46 rss: 70Mb L: 27/33 MS: 1 ChangeByte- 00:07:32.403 [2024-04-27 06:50:02.200693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c1100ff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.200718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.200771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.200785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.200834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7711ceff cdw11:e3ce0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.200847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.403 #47 NEW cov: 11737 ft: 14931 corp: 33/632b lim: 35 exec/s: 47 rss: 70Mb L: 26/33 MS: 1 PersAutoDict- DE: "\377w\021\343\316?\011z"- 00:07:32.403 [2024-04-27 06:50:02.240497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:931100ff cdw11:e3ce0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.240521] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.403 #48 NEW cov: 11737 ft: 14963 corp: 34/641b lim: 35 exec/s: 48 rss: 70Mb L: 9/33 MS: 1 ChangeByte- 00:07:32.403 [2024-04-27 06:50:02.280874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.280900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.280952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff730003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.280965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.403 [2024-04-27 06:50:02.281015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e9ff5f64 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.403 [2024-04-27 06:50:02.281028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.663 #49 NEW cov: 11737 ft: 14986 corp: 35/662b lim: 35 exec/s: 49 rss: 70Mb L: 21/33 MS: 1 CopyPart- 00:07:32.663 [2024-04-27 06:50:02.320710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:92929292 cdw11:92920001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.320736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 #50 NEW cov: 11737 ft: 15003 corp: 36/672b lim: 35 exec/s: 50 rss: 70Mb L: 10/33 MS: 1 InsertRepeatedBytes- 00:07:32.663 [2024-04-27 06:50:02.360836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:931100ff cdw11:e3e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.360862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 #51 NEW cov: 11737 ft: 15014 corp: 37/681b lim: 35 exec/s: 51 rss: 70Mb L: 9/33 MS: 1 CopyPart- 00:07:32.663 [2024-04-27 06:50:02.401413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.401439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.401489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e15fff73 cdw11:64e90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.401501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.401549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff7800 cdw11:ff0a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.401577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 
06:50:02.401627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f0e9e15f cdw11:11780000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.401641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.663 #52 NEW cov: 11737 ft: 15022 corp: 38/709b lim: 35 exec/s: 52 rss: 70Mb L: 28/33 MS: 1 CrossOver- 00:07:32.663 [2024-04-27 06:50:02.441245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff77009c cdw11:73e10002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.441270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.441319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5f6473e1 cdw11:e9110002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.441332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.663 #53 NEW cov: 11737 ft: 15040 corp: 39/727b lim: 35 exec/s: 53 rss: 70Mb L: 18/33 MS: 1 CopyPart- 00:07:32.663 [2024-04-27 06:50:02.481512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.481537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.481586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.481598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.481648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff25ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.481661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.663 #54 NEW cov: 11737 ft: 15056 corp: 40/749b lim: 35 exec/s: 54 rss: 70Mb L: 22/33 MS: 1 InsertByte- 00:07:32.663 [2024-04-27 06:50:02.511695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.511720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.511772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e15fff73 cdw11:64e90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.511785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.511834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00007800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.511848] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.511897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:0a730003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.511909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.663 #55 NEW cov: 11737 ft: 15073 corp: 41/783b lim: 35 exec/s: 55 rss: 71Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:32.663 [2024-04-27 06:50:02.551716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f6473e1 cdw11:e9110002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.551741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.551794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:25ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.551807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.663 [2024-04-27 06:50:02.551861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.663 [2024-04-27 06:50:02.551874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.923 #56 NEW cov: 11737 ft: 15094 corp: 42/804b lim: 35 exec/s: 56 rss: 71Mb L: 21/34 MS: 1 PersAutoDict- DE: "s\341_d\351\021x\000"- 00:07:32.923 [2024-04-27 06:50:02.591529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a735f64 cdw11:e15f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.923 [2024-04-27 06:50:02.591555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.923 #57 NEW cov: 11737 ft: 15106 corp: 43/815b lim: 35 exec/s: 57 rss: 71Mb L: 11/34 MS: 1 CrossOver- 00:07:32.923 [2024-04-27 06:50:02.632079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.923 [2024-04-27 06:50:02.632105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.632155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:32ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.632168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.632219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:32730003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.632233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.632282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO 
CQ (05) qid:0 cid:7 nsid:0 cdw10:e9115f64 cdw11:78000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.632295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.924 #58 NEW cov: 11737 ft: 15114 corp: 44/848b lim: 35 exec/s: 58 rss: 71Mb L: 33/34 MS: 1 PersAutoDict- DE: "s\341_d\351\021x\000"- 00:07:32.924 [2024-04-27 06:50:02.672030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.672056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.672109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:32ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.672122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.672170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.672198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.924 #59 NEW cov: 11737 ft: 15140 corp: 45/873b lim: 35 exec/s: 59 rss: 71Mb L: 25/34 MS: 1 EraseBytes- 00:07:32.924 [2024-04-27 06:50:02.712280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff43ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.712305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.712356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:32ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.712373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.712427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00ff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.712440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.924 [2024-04-27 06:50:02.712489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ce3f11e3 cdw11:09000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.924 [2024-04-27 06:50:02.712501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.924 #60 NEW cov: 11737 ft: 15148 corp: 46/906b lim: 35 exec/s: 30 rss: 71Mb L: 33/34 MS: 1 CrossOver- 00:07:32.924 #60 DONE cov: 11737 ft: 15148 corp: 46/906b lim: 35 exec/s: 30 rss: 71Mb 00:07:32.924 ###### Recommended dictionary. ###### 00:07:32.924 "s\341_d\351\021x\000" # Uses: 6 00:07:32.924 "\377w\021\343\316?\011z" # Uses: 1 00:07:32.924 ###### End of recommended dictionary. 
###### 00:07:32.924 Done 60 runs in 2 second(s) 00:07:33.190 06:50:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:33.190 06:50:02 -- ../common.sh@72 -- # (( i++ )) 00:07:33.190 06:50:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:33.190 06:50:02 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:33.190 06:50:02 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:33.190 06:50:02 -- nvmf/run.sh@24 -- # local timen=1 00:07:33.190 06:50:02 -- nvmf/run.sh@25 -- # local core=0x1 00:07:33.190 06:50:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:33.190 06:50:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:33.190 06:50:02 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:33.190 06:50:02 -- nvmf/run.sh@29 -- # port=4405 00:07:33.190 06:50:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:33.190 06:50:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:33.190 06:50:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:33.190 06:50:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:33.190 [2024-04-27 06:50:02.894725] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:33.190 [2024-04-27 06:50:02.894827] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2620766 ] 00:07:33.190 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.190 [2024-04-27 06:50:03.072819] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.574 [2024-04-27 06:50:03.092530] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.574 [2024-04-27 06:50:03.092660] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.574 [2024-04-27 06:50:03.144204] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:33.574 [2024-04-27 06:50:03.160529] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:33.574 INFO: Running with entropic power schedule (0xFF, 100). 00:07:33.574 INFO: Seed: 1757535721 00:07:33.574 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:33.574 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:33.574 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:33.574 INFO: A corpus is not provided, starting from an empty corpus 00:07:33.574 #2 INITED exec/s: 0 rss: 59Mb 00:07:33.574 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
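Round 4 above finishes ("Done 60 runs in 2 second(s)") and the same ../common.sh loop immediately sets up round 5 on port 4405. A sketch of that driver loop, inferred from the @72/@73 trace lines; fuzz_num and the starting value of i are set outside this excerpt and are assumed:

    # Driver loop inferred from the "../common.sh@72 -- # (( i++ ))",
    # "@72 -- # (( i < fuzz_num ))" and "@73 -- # start_llvm_fuzz ..." trace
    # lines above; i starting at 0 and the value of fuzz_num are assumptions.
    for (( i = 0; i < fuzz_num; i++ )); do
        start_llvm_fuzz "$i" 1 0x1   # args: fuzzer_type, timen, core mask
    done

In the "#N NEW cov:" status lines that follow, the fields are libFuzzer's usual progress output: cov counts covered control-flow edges, ft distinct coverage features, corp the corpus size and its total bytes, lim the current input-length cap, L the new input's length over the largest in the corpus, and MS the mutation sequence that produced it.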
00:07:33.574 This may also happen if the target rejected all inputs we tried so far 00:07:33.574 [2024-04-27 06:50:03.205747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.575 [2024-04-27 06:50:03.205776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.861 NEW_FUNC[1/664]: 0x4a5850 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:33.861 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.861 #14 NEW cov: 11513 ft: 11522 corp: 2/12b lim: 45 exec/s: 0 rss: 67Mb L: 11/11 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:33.861 [2024-04-27 06:50:03.516819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.516851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.861 [2024-04-27 06:50:03.516908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.516923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.861 [2024-04-27 06:50:03.516977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.516992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.861 #20 NEW cov: 11634 ft: 12589 corp: 3/43b lim: 45 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:33.861 [2024-04-27 06:50:03.556526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.556552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.861 #21 NEW cov: 11640 ft: 12917 corp: 4/58b lim: 45 exec/s: 0 rss: 67Mb L: 15/31 MS: 1 CMP- DE: "\001\000\000\021"- 00:07:33.861 [2024-04-27 06:50:03.596619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5b5b5b5b cdw11:5b5b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.596644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.861 #27 NEW cov: 11725 ft: 13318 corp: 5/72b lim: 45 exec/s: 0 rss: 67Mb L: 14/31 MS: 1 InsertRepeatedBytes- 00:07:33.861 [2024-04-27 06:50:03.636731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff cdw11:a6fc0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.636756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.861 #28 NEW cov: 11725 ft: 13400 
corp: 6/83b lim: 45 exec/s: 0 rss: 67Mb L: 11/31 MS: 1 CMP- DE: "+\257\246\374\265\006\000\000"- 00:07:33.861 [2024-04-27 06:50:03.676827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff cdw11:71a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.861 [2024-04-27 06:50:03.676853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.861 #29 NEW cov: 11725 ft: 13449 corp: 7/95b lim: 45 exec/s: 0 rss: 67Mb L: 12/31 MS: 1 InsertByte- 00:07:33.861 [2024-04-27 06:50:03.717006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff cdw11:a6fc0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.862 [2024-04-27 06:50:03.717034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.862 #30 NEW cov: 11725 ft: 13481 corp: 8/107b lim: 45 exec/s: 0 rss: 68Mb L: 12/31 MS: 1 PersAutoDict- DE: "+\257\246\374\265\006\000\000"- 00:07:33.862 [2024-04-27 06:50:03.757162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.862 [2024-04-27 06:50:03.757189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.121 #31 NEW cov: 11725 ft: 13509 corp: 9/122b lim: 45 exec/s: 0 rss: 68Mb L: 15/31 MS: 1 CopyPart- 00:07:34.121 [2024-04-27 06:50:03.797614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1f1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.121 [2024-04-27 06:50:03.797639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.121 [2024-04-27 06:50:03.797696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.121 [2024-04-27 06:50:03.797726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.122 [2024-04-27 06:50:03.797782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.122 [2024-04-27 06:50:03.797796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.122 #32 NEW cov: 11725 ft: 13568 corp: 10/153b lim: 45 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBinInt- 00:07:34.122 [2024-04-27 06:50:03.837304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff cdw11:a6fc0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.122 [2024-04-27 06:50:03.837330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.122 #33 NEW cov: 11725 ft: 13609 corp: 11/164b lim: 45 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 PersAutoDict- DE: "+\257\246\374\265\006\000\000"- 00:07:34.122 [2024-04-27 06:50:03.877468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.122 [2024-04-27 
06:50:03.877494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.122 #34 NEW cov: 11725 ft: 13639 corp: 12/175b lim: 45 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 ChangeBit- 00:07:34.122 [2024-04-27 06:50:03.917548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ff6e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.122 [2024-04-27 06:50:03.917574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.122 #35 NEW cov: 11725 ft: 13696 corp: 13/186b lim: 45 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 ChangeByte- 00:07:34.122 [2024-04-27 06:50:03.957687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff cdw11:71a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.122 [2024-04-27 06:50:03.957712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.122 #36 NEW cov: 11725 ft: 13709 corp: 14/198b lim: 45 exec/s: 0 rss: 68Mb L: 12/31 MS: 1 ShuffleBytes- 00:07:34.122 [2024-04-27 06:50:03.987773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06fcffff cdw11:a6b50005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.122 [2024-04-27 06:50:03.987798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.122 #37 NEW cov: 11725 ft: 13721 corp: 15/209b lim: 45 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 ShuffleBytes- 00:07:34.382 [2024-04-27 06:50:04.028220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.028246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.028304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.028318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.028372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.028386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.382 #38 NEW cov: 11725 ft: 13773 corp: 16/240b lim: 45 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeByte- 00:07:34.382 [2024-04-27 06:50:04.067980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ff6a0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.068005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 #39 NEW cov: 11725 ft: 13789 corp: 17/251b lim: 45 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 ChangeBit- 00:07:34.382 [2024-04-27 06:50:04.108115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff 
cdw11:71a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.108140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:34.382 #40 NEW cov: 11748 ft: 13832 corp: 18/264b lim: 45 exec/s: 0 rss: 68Mb L: 13/31 MS: 1 InsertByte- 00:07:34.382 [2024-04-27 06:50:04.148390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5b5b5b5b cdw11:5b5b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.148422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.148476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5b5b5b5b cdw11:5b5b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.148491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.382 #46 NEW cov: 11748 ft: 14115 corp: 19/283b lim: 45 exec/s: 0 rss: 69Mb L: 19/31 MS: 1 CopyPart- 00:07:34.382 [2024-04-27 06:50:04.188346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.188371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 #47 NEW cov: 11748 ft: 14138 corp: 20/294b lim: 45 exec/s: 47 rss: 69Mb L: 11/31 MS: 1 ChangeBinInt- 00:07:34.382 [2024-04-27 06:50:04.228781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06fcffff cdw11:a6b50005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.228806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.228863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.228877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.228934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.228948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.382 #48 NEW cov: 11748 ft: 14142 corp: 21/323b lim: 45 exec/s: 48 rss: 69Mb L: 29/31 MS: 1 InsertRepeatedBytes- 00:07:34.382 [2024-04-27 06:50:04.269036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.269061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.269132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b 
cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.269146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.269202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.269216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.382 [2024-04-27 06:50:04.269270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1b1b1b1b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.382 [2024-04-27 06:50:04.269284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.642 #49 NEW cov: 11748 ft: 14486 corp: 22/360b lim: 45 exec/s: 49 rss: 69Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:34.642 [2024-04-27 06:50:04.308675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5b5b5b5b cdw11:5b5b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.642 [2024-04-27 06:50:04.308700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.642 #50 NEW cov: 11748 ft: 14508 corp: 23/374b lim: 45 exec/s: 50 rss: 69Mb L: 14/37 MS: 1 CrossOver- 00:07:34.642 [2024-04-27 06:50:04.348775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.642 [2024-04-27 06:50:04.348802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.642 #51 NEW cov: 11748 ft: 14564 corp: 24/385b lim: 45 exec/s: 51 rss: 69Mb L: 11/37 MS: 1 CrossOver- 00:07:34.642 [2024-04-27 06:50:04.389176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ff6a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.642 [2024-04-27 06:50:04.389210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.642 #52 NEW cov: 11748 ft: 14611 corp: 25/397b lim: 45 exec/s: 52 rss: 69Mb L: 12/37 MS: 1 InsertByte- 00:07:34.642 [2024-04-27 06:50:04.429030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.642 [2024-04-27 06:50:04.429056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.642 #53 NEW cov: 11748 ft: 14740 corp: 26/412b lim: 45 exec/s: 53 rss: 69Mb L: 15/37 MS: 1 ShuffleBytes- 00:07:34.642 [2024-04-27 06:50:04.469179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff03ffbf cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.642 [2024-04-27 06:50:04.469204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.642 #54 NEW cov: 11748 ft: 14753 corp: 27/423b lim: 45 exec/s: 54 rss: 69Mb L: 11/37 MS: 1 ChangeBinInt- 00:07:34.642 [2024-04-27 06:50:04.499259] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2903ffbf cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.642 [2024-04-27 06:50:04.499284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.642 #55 NEW cov: 11748 ft: 14760 corp: 28/434b lim: 45 exec/s: 55 rss: 69Mb L: 11/37 MS: 1 ChangeByte- 00:07:34.901 [2024-04-27 06:50:04.539354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.901 [2024-04-27 06:50:04.539379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.901 #56 NEW cov: 11748 ft: 14777 corp: 29/446b lim: 45 exec/s: 56 rss: 69Mb L: 12/37 MS: 1 CrossOver- 00:07:34.901 [2024-04-27 06:50:04.569405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ff01 cdw11:116a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.901 [2024-04-27 06:50:04.569430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.902 #57 NEW cov: 11748 ft: 14789 corp: 30/458b lim: 45 exec/s: 57 rss: 69Mb L: 12/37 MS: 1 PersAutoDict- DE: "\001\000\000\021"- 00:07:34.902 [2024-04-27 06:50:04.609534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.609559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.902 #58 NEW cov: 11748 ft: 14801 corp: 31/474b lim: 45 exec/s: 58 rss: 70Mb L: 16/37 MS: 1 InsertByte- 00:07:34.902 [2024-04-27 06:50:04.649680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fc71ffff cdw11:a6af0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.649706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.902 #59 NEW cov: 11748 ft: 14829 corp: 32/486b lim: 45 exec/s: 59 rss: 70Mb L: 12/37 MS: 1 ShuffleBytes- 00:07:34.902 [2024-04-27 06:50:04.680214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2bafffff cdw11:71a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.680238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.902 [2024-04-27 06:50:04.680309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.680323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.902 [2024-04-27 06:50:04.680378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.680392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.902 [2024-04-27 06:50:04.680453] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.680466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.902 #60 NEW cov: 11748 ft: 14847 corp: 33/524b lim: 45 exec/s: 60 rss: 70Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:34.902 [2024-04-27 06:50:04.719874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ff07 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.719902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.902 #64 NEW cov: 11748 ft: 14870 corp: 34/535b lim: 45 exec/s: 64 rss: 70Mb L: 11/38 MS: 4 EraseBytes-ChangeBinInt-ChangeBinInt-PersAutoDict- DE: "\001\000\000\021"- 00:07:34.902 [2024-04-27 06:50:04.749947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cf50ffff cdw11:8e590000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.749971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.902 #65 NEW cov: 11748 ft: 14918 corp: 35/547b lim: 45 exec/s: 65 rss: 70Mb L: 12/38 MS: 1 ChangeBinInt- 00:07:34.902 [2024-04-27 06:50:04.790087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff01ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.902 [2024-04-27 06:50:04.790112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 #66 NEW cov: 11748 ft: 14944 corp: 36/558b lim: 45 exec/s: 66 rss: 70Mb L: 11/38 MS: 1 PersAutoDict- DE: "\001\000\000\021"- 00:07:35.161 [2024-04-27 06:50:04.820186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.820210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 #67 NEW cov: 11748 ft: 14984 corp: 37/573b lim: 45 exec/s: 67 rss: 70Mb L: 15/38 MS: 1 CopyPart- 00:07:35.161 [2024-04-27 06:50:04.860245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.860270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 #68 NEW cov: 11748 ft: 14995 corp: 38/588b lim: 45 exec/s: 68 rss: 70Mb L: 15/38 MS: 1 ChangeByte- 00:07:35.161 [2024-04-27 06:50:04.890699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.890724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:04.890782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.890796] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:04.890853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.890867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.161 #69 NEW cov: 11748 ft: 15010 corp: 39/619b lim: 45 exec/s: 69 rss: 70Mb L: 31/38 MS: 1 ChangeBit- 00:07:35.161 [2024-04-27 06:50:04.930807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffbf cdw11:ff6e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.930832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:04.930888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.930901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:04.930956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.930972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.161 #70 NEW cov: 11748 ft: 15028 corp: 40/650b lim: 45 exec/s: 70 rss: 70Mb L: 31/38 MS: 1 CrossOver- 00:07:35.161 [2024-04-27 06:50:04.970583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:04.970609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 #71 NEW cov: 11748 ft: 15052 corp: 41/666b lim: 45 exec/s: 71 rss: 70Mb L: 16/38 MS: 1 ChangeBit- 00:07:35.161 [2024-04-27 06:50:05.011171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:05.011195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:05.011267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:05.011280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:05.011334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:05.011347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.161 [2024-04-27 06:50:05.011402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 
cdw10:1b1b1b1b cdw11:1b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:05.011416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.161 #72 NEW cov: 11748 ft: 15058 corp: 42/704b lim: 45 exec/s: 72 rss: 70Mb L: 38/38 MS: 1 InsertByte- 00:07:35.161 [2024-04-27 06:50:05.050818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:af71ff2b cdw11:a6a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.161 [2024-04-27 06:50:05.050842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.421 #73 NEW cov: 11748 ft: 15107 corp: 43/716b lim: 45 exec/s: 73 rss: 70Mb L: 12/38 MS: 1 CrossOver- 00:07:35.421 [2024-04-27 06:50:05.081346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.421 [2024-04-27 06:50:05.081371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.421 [2024-04-27 06:50:05.081431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.421 [2024-04-27 06:50:05.081446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.421 [2024-04-27 06:50:05.081499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b3b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.421 [2024-04-27 06:50:05.081513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.421 [2024-04-27 06:50:05.081566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1b1b1b1b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.421 [2024-04-27 06:50:05.081579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.421 #74 NEW cov: 11748 ft: 15114 corp: 44/753b lim: 45 exec/s: 74 rss: 70Mb L: 37/38 MS: 1 ChangeBit- 00:07:35.421 [2024-04-27 06:50:05.120987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ff07 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.421 [2024-04-27 06:50:05.121012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.421 #75 NEW cov: 11748 ft: 15138 corp: 45/764b lim: 45 exec/s: 75 rss: 70Mb L: 11/38 MS: 1 ChangeBinInt- 00:07:35.421 [2024-04-27 06:50:05.161622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.421 [2024-04-27 06:50:05.161648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.421 [2024-04-27 06:50:05.161703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1ba60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.422 [2024-04-27 06:50:05.161716] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.422 [2024-04-27 06:50:05.161769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.422 [2024-04-27 06:50:05.161782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.422 [2024-04-27 06:50:05.161833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.422 [2024-04-27 06:50:05.161846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.422 #76 NEW cov: 11748 ft: 15154 corp: 46/804b lim: 45 exec/s: 76 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:35.422 [2024-04-27 06:50:05.201596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1b1b0a1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.422 [2024-04-27 06:50:05.201621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.422 [2024-04-27 06:50:05.201677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.422 [2024-04-27 06:50:05.201691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.422 [2024-04-27 06:50:05.201746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.422 [2024-04-27 06:50:05.201759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.422 #77 NEW cov: 11748 ft: 15163 corp: 47/838b lim: 45 exec/s: 38 rss: 70Mb L: 34/40 MS: 1 EraseBytes- 00:07:35.422 #77 DONE cov: 11748 ft: 15163 corp: 47/838b lim: 45 exec/s: 38 rss: 70Mb 00:07:35.422 ###### Recommended dictionary. ###### 00:07:35.422 "\001\000\000\021" # Uses: 3 00:07:35.422 "+\257\246\374\265\006\000\000" # Uses: 2 00:07:35.422 ###### End of recommended dictionary. 
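The nvmf/run.sh trace records above show how each fuzzer run derives its own TCP listen port and per-run JSON config before launching llvm_nvme_fuzz. A minimal shell sketch of those traced steps follows; the redirection into $nvmf_cfg and the $spdk_dir shorthand are assumptions, while the individual commands are taken from the trace itself:

    fuzzer_type=5                         # becomes 6 on the next iteration
    port=44$(printf %02d $fuzzer_type)    # 5 -> 4405, 6 -> 4406
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    mkdir -p "$spdk_dir/../corpus/llvm_nvmf_${fuzzer_type}"
    # rewrite the template config so the target listens on this run's port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$spdk_dir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

The launcher then hands $trid to llvm_nvme_fuzz via -F, the rewritten config via -c, the one-minute budget via -t 1, and the per-fuzzer corpus directory via -D, as visible in the traced command line for runs 5 and 6.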
###### 00:07:35.422 Done 77 runs in 2 second(s) 00:07:35.681 06:50:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:35.681 06:50:05 -- ../common.sh@72 -- # (( i++ )) 00:07:35.681 06:50:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.681 06:50:05 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:35.681 06:50:05 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:35.681 06:50:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.681 06:50:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.681 06:50:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:35.681 06:50:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:35.681 06:50:05 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:35.681 06:50:05 -- nvmf/run.sh@29 -- # port=4406 00:07:35.681 06:50:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:35.681 06:50:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:35.681 06:50:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.681 06:50:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:35.681 [2024-04-27 06:50:05.374004] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:35.681 [2024-04-27 06:50:05.374077] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2621060 ] 00:07:35.681 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.681 [2024-04-27 06:50:05.554426] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.681 [2024-04-27 06:50:05.573874] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.681 [2024-04-27 06:50:05.574004] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.939 [2024-04-27 06:50:05.625873] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.939 [2024-04-27 06:50:05.642191] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:35.939 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.939 INFO: Seed: 4238548106 00:07:35.939 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:35.939 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:35.939 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:35.939 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.939 #2 INITED exec/s: 0 rss: 59Mb 00:07:35.939 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:35.939 This may also happen if the target rejected all inputs we tried so far 00:07:35.939 [2024-04-27 06:50:05.708900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:35.939 [2024-04-27 06:50:05.708937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.939 [2024-04-27 06:50:05.709059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:35.939 [2024-04-27 06:50:05.709077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.940 [2024-04-27 06:50:05.709187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:35.940 [2024-04-27 06:50:05.709203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.940 [2024-04-27 06:50:05.709316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:35.940 [2024-04-27 06:50:05.709333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.940 [2024-04-27 06:50:05.709445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:35.940 [2024-04-27 06:50:05.709462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.198 NEW_FUNC[1/662]: 0x4a8060 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:36.198 NEW_FUNC[2/662]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.198 #3 NEW cov: 11438 ft: 11433 corp: 2/11b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:36.198 [2024-04-27 06:50:06.038853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:36.198 [2024-04-27 06:50:06.038892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.198 #5 NEW cov: 11551 ft: 12391 corp: 3/13b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 2 ShuffleBytes-CopyPart- 00:07:36.198 [2024-04-27 06:50:06.078901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e44 cdw11:00000000 00:07:36.198 [2024-04-27 06:50:06.078933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 #8 NEW cov: 11557 ft: 12605 corp: 4/15b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 3 EraseBytes-ChangeBit-InsertByte- 00:07:36.457 [2024-04-27 06:50:06.118990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e44 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.119016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 #9 NEW cov: 11642 ft: 12866 corp: 5/17b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 
CopyPart- 00:07:36.457 [2024-04-27 06:50:06.159158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002844 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.159185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 #10 NEW cov: 11642 ft: 12895 corp: 6/19b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 ChangeByte- 00:07:36.457 [2024-04-27 06:50:06.199885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.199912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 [2024-04-27 06:50:06.200022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.200038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.457 [2024-04-27 06:50:06.200141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.200157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.457 [2024-04-27 06:50:06.200265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.200281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.457 #11 NEW cov: 11642 ft: 12962 corp: 7/28b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:36.457 [2024-04-27 06:50:06.239451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.239479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 #12 NEW cov: 11642 ft: 12997 corp: 8/31b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 CrossOver- 00:07:36.457 [2024-04-27 06:50:06.279544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.279572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 #13 NEW cov: 11642 ft: 13018 corp: 9/33b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:36.457 [2024-04-27 06:50:06.320307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.320337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.457 [2024-04-27 06:50:06.320456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.320473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.457 [2024-04-27 06:50:06.320594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.320611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.457 [2024-04-27 06:50:06.320744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.457 [2024-04-27 06:50:06.320762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.457 #14 NEW cov: 11642 ft: 13042 corp: 10/42b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:36.716 [2024-04-27 06:50:06.359855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.359882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 #15 NEW cov: 11642 ft: 13108 corp: 11/44b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeBit- 00:07:36.716 [2024-04-27 06:50:06.399902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e44 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.399930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 #16 NEW cov: 11642 ft: 13123 corp: 12/46b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:36.716 [2024-04-27 06:50:06.440029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a95 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.440055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 #17 NEW cov: 11642 ft: 13149 corp: 13/48b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeByte- 00:07:36.716 [2024-04-27 06:50:06.480736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.480762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 [2024-04-27 06:50:06.480882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.480899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.716 [2024-04-27 06:50:06.481008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.481025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.716 [2024-04-27 06:50:06.481146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000044 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.481163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.716 #18 NEW cov: 11642 ft: 13235 corp: 14/56b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:36.716 [2024-04-27 06:50:06.520148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e02 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.520175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 #19 NEW cov: 11642 ft: 13245 corp: 15/58b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:36.716 [2024-04-27 06:50:06.560442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002844 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.560468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.716 #20 NEW cov: 11665 ft: 13270 corp: 16/60b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 CopyPart- 00:07:36.716 [2024-04-27 06:50:06.600726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e44 cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.600754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.716 [2024-04-27 06:50:06.600866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:36.716 [2024-04-27 06:50:06.600882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.976 #21 NEW cov: 11665 ft: 13518 corp: 17/64b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 CrossOver- 00:07:36.976 [2024-04-27 06:50:06.640679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a95 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.640707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.976 #22 NEW cov: 11665 ft: 13554 corp: 18/66b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 CrossOver- 00:07:36.976 [2024-04-27 06:50:06.681435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.681462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.976 [2024-04-27 06:50:06.681570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.681585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.976 [2024-04-27 06:50:06.681694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.681712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.976 [2024-04-27 06:50:06.681827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.681843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.976 #23 NEW cov: 11665 ft: 13628 corp: 19/75b lim: 10 exec/s: 23 rss: 69Mb L: 9/10 MS: 1 
ChangeBit- 00:07:36.976 [2024-04-27 06:50:06.720920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e2d cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.720948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.976 #24 NEW cov: 11665 ft: 13660 corp: 20/78b lim: 10 exec/s: 24 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:07:36.976 [2024-04-27 06:50:06.761197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a95 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.761224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.976 [2024-04-27 06:50:06.761343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000e2d cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.761360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.976 #25 NEW cov: 11665 ft: 13693 corp: 21/83b lim: 10 exec/s: 25 rss: 69Mb L: 5/10 MS: 1 CrossOver- 00:07:36.976 [2024-04-27 06:50:06.801154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.801181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.976 #26 NEW cov: 11665 ft: 13718 corp: 22/86b lim: 10 exec/s: 26 rss: 69Mb L: 3/10 MS: 1 CrossOver- 00:07:36.976 [2024-04-27 06:50:06.841306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a94 cdw11:00000000 00:07:36.976 [2024-04-27 06:50:06.841333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.976 #27 NEW cov: 11665 ft: 13759 corp: 23/88b lim: 10 exec/s: 27 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:37.237 [2024-04-27 06:50:06.881441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff06 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:06.881469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.237 #28 NEW cov: 11665 ft: 13832 corp: 24/90b lim: 10 exec/s: 28 rss: 69Mb L: 2/10 MS: 1 CMP- DE: "\377\006"- 00:07:37.237 [2024-04-27 06:50:06.921525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff06 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:06.921551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.237 #32 NEW cov: 11665 ft: 13839 corp: 25/93b lim: 10 exec/s: 32 rss: 69Mb L: 3/10 MS: 4 EraseBytes-CrossOver-ChangeBit-PersAutoDict- DE: "\377\006"- 00:07:37.237 [2024-04-27 06:50:06.961678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e94 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:06.961706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.237 #33 NEW cov: 11665 ft: 13841 corp: 26/95b lim: 10 exec/s: 33 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:37.237 [2024-04-27 06:50:07.002488] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.002516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.002630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f2c cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.002646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.002763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008c0f cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.002779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.002899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000059f1 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.002915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.003025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000e94 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.003041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.237 #34 NEW cov: 11665 ft: 13845 corp: 27/105b lim: 10 exec/s: 34 rss: 69Mb L: 10/10 MS: 1 CMP- DE: "\001\000\177,\214\017Y\361"- 00:07:37.237 [2024-04-27 06:50:07.042514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.042542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.042658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.042675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.042787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.042804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.042891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.042907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.237 #35 NEW cov: 11665 ft: 13855 corp: 28/114b lim: 10 exec/s: 35 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:37.237 [2024-04-27 06:50:07.082639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a5a5 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.082667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 
06:50:07.082775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a5a5 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.082791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.082905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a5a5 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.082922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.237 [2024-04-27 06:50:07.083037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000e94 cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.083055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.237 #36 NEW cov: 11665 ft: 13868 corp: 29/122b lim: 10 exec/s: 36 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:37.237 [2024-04-27 06:50:07.122119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000950a cdw11:00000000 00:07:37.237 [2024-04-27 06:50:07.122146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 #37 NEW cov: 11665 ft: 13871 corp: 30/124b lim: 10 exec/s: 37 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:37.498 [2024-04-27 06:50:07.162397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e44 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.162425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.162542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.162559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.498 #38 NEW cov: 11665 ft: 13878 corp: 31/128b lim: 10 exec/s: 38 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:07:37.498 [2024-04-27 06:50:07.203242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.203269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.203382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.203405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.203519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.203536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.203651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.203669] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.203788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.203802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.498 #39 NEW cov: 11665 ft: 13909 corp: 32/138b lim: 10 exec/s: 39 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:37.498 [2024-04-27 06:50:07.242714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.242743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.242862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff44 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.242879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.498 #40 NEW cov: 11665 ft: 13944 corp: 33/143b lim: 10 exec/s: 40 rss: 70Mb L: 5/10 MS: 1 CrossOver- 00:07:37.498 [2024-04-27 06:50:07.282684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.282711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 #41 NEW cov: 11665 ft: 13946 corp: 34/145b lim: 10 exec/s: 41 rss: 70Mb L: 2/10 MS: 1 CrossOver- 00:07:37.498 [2024-04-27 06:50:07.323215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0e cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.323242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.323355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a44 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.323372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.323486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a44 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.323503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.498 #42 NEW cov: 11665 ft: 14096 corp: 35/151b lim: 10 exec/s: 42 rss: 70Mb L: 6/10 MS: 1 CopyPart- 00:07:37.498 [2024-04-27 06:50:07.363513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.363540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.363651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.363667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:37.498 [2024-04-27 06:50:07.363781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.363800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.498 [2024-04-27 06:50:07.363912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000044 cdw11:00000000 00:07:37.498 [2024-04-27 06:50:07.363928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.498 #43 NEW cov: 11665 ft: 14116 corp: 36/159b lim: 10 exec/s: 43 rss: 70Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:37.758 [2024-04-27 06:50:07.403056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff06 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.403083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 #44 NEW cov: 11665 ft: 14131 corp: 37/161b lim: 10 exec/s: 44 rss: 70Mb L: 2/10 MS: 1 PersAutoDict- DE: "\377\006"- 00:07:37.758 [2024-04-27 06:50:07.443131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f668 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.443158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 #45 NEW cov: 11665 ft: 14144 corp: 38/163b lim: 10 exec/s: 45 rss: 70Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:37.758 [2024-04-27 06:50:07.483284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.483311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 #46 NEW cov: 11665 ft: 14201 corp: 39/165b lim: 10 exec/s: 46 rss: 70Mb L: 2/10 MS: 1 CopyPart- 00:07:37.758 [2024-04-27 06:50:07.523955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000400 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.523980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.524099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.524115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.524222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.524238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.524351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.524367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.758 #47 NEW cov: 11665 ft: 14202 corp: 40/174b lim: 10 exec/s: 47 rss: 70Mb L: 9/10 
MS: 1 CMP- DE: "\004\000"- 00:07:37.758 [2024-04-27 06:50:07.564224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.564251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.564366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f2c cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.564383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.564503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008c0f cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.564518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.564630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000590a cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.564649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.564760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000094 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.564776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.758 #48 NEW cov: 11665 ft: 14210 corp: 41/184b lim: 10 exec/s: 48 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:37.758 [2024-04-27 06:50:07.603884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e44 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.603913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.604030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.604046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.604163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.604179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.758 #49 NEW cov: 11665 ft: 14241 corp: 42/190b lim: 10 exec/s: 49 rss: 70Mb L: 6/10 MS: 1 CrossOver- 00:07:37.758 [2024-04-27 06:50:07.644331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a5a5 cdw11:00000000 00:07:37.758 [2024-04-27 06:50:07.644360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.758 [2024-04-27 06:50:07.644489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a5f4 cdw11:00000000 00:07:37.759 [2024-04-27 06:50:07.644507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.759 [2024-04-27 06:50:07.644614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a5a5 cdw11:00000000 00:07:37.759 [2024-04-27 06:50:07.644630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.759 [2024-04-27 06:50:07.644746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000a50e cdw11:00000000 00:07:37.759 [2024-04-27 06:50:07.644763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.018 #50 NEW cov: 11665 ft: 14262 corp: 43/199b lim: 10 exec/s: 50 rss: 70Mb L: 9/10 MS: 1 InsertByte- 00:07:38.018 [2024-04-27 06:50:07.683940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000400 cdw11:00000000 00:07:38.018 [2024-04-27 06:50:07.683967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.018 #51 NEW cov: 11665 ft: 14269 corp: 44/201b lim: 10 exec/s: 25 rss: 70Mb L: 2/10 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:38.018 #51 DONE cov: 11665 ft: 14269 corp: 44/201b lim: 10 exec/s: 25 rss: 70Mb 00:07:38.018 ###### Recommended dictionary. ###### 00:07:38.018 "\377\006" # Uses: 2 00:07:38.018 "\001\000\177,\214\017Y\361" # Uses: 0 00:07:38.018 "\004\000" # Uses: 1 00:07:38.018 ###### End of recommended dictionary. ###### 00:07:38.018 Done 51 runs in 2 second(s) 00:07:38.018 06:50:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:38.018 06:50:07 -- ../common.sh@72 -- # (( i++ )) 00:07:38.018 06:50:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.018 06:50:07 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:38.018 06:50:07 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:38.018 06:50:07 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.018 06:50:07 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.018 06:50:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:38.018 06:50:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:38.018 06:50:07 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:38.018 06:50:07 -- nvmf/run.sh@29 -- # port=4407 00:07:38.018 06:50:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:38.018 06:50:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:38.018 06:50:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.018 06:50:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:38.018 [2024-04-27 06:50:07.865931] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:38.018 [2024-04-27 06:50:07.866023] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2621603 ] 00:07:38.018 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.278 [2024-04-27 06:50:08.042717] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.278 [2024-04-27 06:50:08.061965] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:38.278 [2024-04-27 06:50:08.062087] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.278 [2024-04-27 06:50:08.113413] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.278 [2024-04-27 06:50:08.129725] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:38.278 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.278 INFO: Seed: 2432565519 00:07:38.278 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:38.278 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:38.278 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:38.278 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.278 #2 INITED exec/s: 0 rss: 59Mb 00:07:38.278 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:38.278 This may also happen if the target rejected all inputs we tried so far 00:07:38.537 [2024-04-27 06:50:08.174957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000300a cdw11:00000000 00:07:38.537 [2024-04-27 06:50:08.174984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 NEW_FUNC[1/662]: 0x4a8a50 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:38.797 NEW_FUNC[2/662]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.797 #8 NEW cov: 11438 ft: 11439 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:38.797 [2024-04-27 06:50:08.476018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.476049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.476102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.476116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.476170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.476186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.476236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:7 nsid:0 cdw10:00000030 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.476249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.797 #11 NEW cov: 11551 ft: 12082 corp: 3/11b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 3 EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:38.797 [2024-04-27 06:50:08.525839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.525865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.525934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.525948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.797 #12 NEW cov: 11557 ft: 12462 corp: 4/16b lim: 10 exec/s: 0 rss: 67Mb L: 5/8 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:38.797 [2024-04-27 06:50:08.566166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.566191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.566256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.566270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.566320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.566333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.566383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000030 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.566401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.797 #13 NEW cov: 11642 ft: 12716 corp: 5/24b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ChangeByte- 00:07:38.797 [2024-04-27 06:50:08.606278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.606302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.606353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.606367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.606417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.606431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.606479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000030 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.606492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.797 #14 NEW cov: 11642 ft: 12802 corp: 6/32b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:38.797 [2024-04-27 06:50:08.646331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.646356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.646411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.646425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.646475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.646488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.646538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000030 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.646552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.797 #15 NEW cov: 11642 ft: 12913 corp: 7/40b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ChangeASCIIInt- 00:07:38.797 [2024-04-27 06:50:08.686497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.686522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.686588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.686601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.686651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.797 [2024-04-27 06:50:08.686664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.797 [2024-04-27 06:50:08.686715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.798 [2024-04-27 06:50:08.686728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.057 #16 NEW cov: 11642 ft: 12979 corp: 8/48b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:39.057 [2024-04-27 06:50:08.726705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a10 cdw11:00000000 
00:07:39.057 [2024-04-27 06:50:08.726731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.726781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.726794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.726842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.726871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.726921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.726934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.726984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.726997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.057 #19 NEW cov: 11642 ft: 13077 corp: 9/58b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:39.057 [2024-04-27 06:50:08.766845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a10 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.766870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.766921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.766934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.766984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.766997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.767046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.767059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.767108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00003010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.767122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.057 #20 NEW cov: 11642 ft: 13105 corp: 10/68b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:39.057 [2024-04-27 06:50:08.806774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:0000300a cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.806799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.806850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.806863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.806911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.806924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.806973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.806987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.057 #21 NEW cov: 11642 ft: 13160 corp: 11/77b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 CrossOver- 00:07:39.057 [2024-04-27 06:50:08.846683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f6fe cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.846708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.846762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.846776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.057 #22 NEW cov: 11642 ft: 13246 corp: 12/82b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 ChangeBinInt- 00:07:39.057 [2024-04-27 06:50:08.887165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000300a cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.887193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.887248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.887261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.887313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001000 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.887327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.057 [2024-04-27 06:50:08.887379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.057 [2024-04-27 06:50:08.887392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.058 [2024-04-27 06:50:08.887451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 
cdw10:00001010 cdw11:00000000 00:07:39.058 [2024-04-27 06:50:08.887464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.058 #23 NEW cov: 11642 ft: 13341 corp: 13/92b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:07:39.058 [2024-04-27 06:50:08.926915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f6fe cdw11:00000000 00:07:39.058 [2024-04-27 06:50:08.926941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.058 [2024-04-27 06:50:08.926994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:39.058 [2024-04-27 06:50:08.927008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.317 #24 NEW cov: 11642 ft: 13390 corp: 14/97b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 CrossOver- 00:07:39.317 [2024-04-27 06:50:08.967128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:08.967154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:08.967207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:39.317 [2024-04-27 06:50:08.967221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:08.967273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:08.967286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.317 #25 NEW cov: 11642 ft: 13536 corp: 15/103b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 EraseBytes- 00:07:39.317 [2024-04-27 06:50:09.007528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000300a cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.007555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.007607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.007621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.007671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.007684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.007739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.007752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 
06:50:09.007803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.007817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.317 #26 NEW cov: 11642 ft: 13561 corp: 16/113b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:39.317 [2024-04-27 06:50:09.047153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f6fe cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.047179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.317 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:39.317 #27 NEW cov: 11665 ft: 13655 corp: 17/116b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 EraseBytes- 00:07:39.317 [2024-04-27 06:50:09.087771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a10 cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.087796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.087848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f0ef cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.087861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.087912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000efef cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.087925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.087976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000efef cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.087989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.317 [2024-04-27 06:50:09.088039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000cfef cdw11:00000000 00:07:39.317 [2024-04-27 06:50:09.088052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.317 #28 NEW cov: 11665 ft: 13680 corp: 18/126b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:39.317 [2024-04-27 06:50:09.127798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f6fe cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.127822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.127874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.127887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.127937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.127950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.128001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.128013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.318 #29 NEW cov: 11665 ft: 13692 corp: 19/135b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:39.318 [2024-04-27 06:50:09.167774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f601 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.167799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.167850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000feff cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.167863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.167914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fe30 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.167927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.318 #30 NEW cov: 11665 ft: 13712 corp: 20/141b lim: 10 exec/s: 30 rss: 70Mb L: 6/10 MS: 1 CrossOver- 00:07:39.318 [2024-04-27 06:50:09.208094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.208119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.208184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.208198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.208247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.208260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.208309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.208322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.318 [2024-04-27 06:50:09.208373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.318 [2024-04-27 06:50:09.208386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.580 #31 NEW cov: 11665 ft: 13731 corp: 21/151b lim: 10 
exec/s: 31 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:39.580 [2024-04-27 06:50:09.248184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a10 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.248208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.248260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.248273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.248323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001000 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.248337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.248388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.248404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.248456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a10 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.248469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.580 #32 NEW cov: 11665 ft: 13792 corp: 22/161b lim: 10 exec/s: 32 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:39.580 [2024-04-27 06:50:09.288215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.288239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.288292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.288305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.288354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.288384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.288436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000030 cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.288450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.580 #33 NEW cov: 11665 ft: 13843 corp: 23/169b lim: 10 exec/s: 33 rss: 70Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:39.580 [2024-04-27 06:50:09.328076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f63b cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.328101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.580 [2024-04-27 06:50:09.328152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.328166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.580 #34 NEW cov: 11665 ft: 13864 corp: 24/174b lim: 10 exec/s: 34 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:39.580 [2024-04-27 06:50:09.368085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a1a cdw11:00000000 00:07:39.580 [2024-04-27 06:50:09.368110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.580 #37 NEW cov: 11665 ft: 13895 corp: 25/176b lim: 10 exec/s: 37 rss: 70Mb L: 2/10 MS: 3 ChangeBit-ShuffleBytes-CopyPart- 00:07:39.580 [2024-04-27 06:50:09.398655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000100a cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.398680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.398732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.398746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.398798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001000 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.398812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.398864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.398877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.398936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.398949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.581 #38 NEW cov: 11665 ft: 13947 corp: 26/186b lim: 10 exec/s: 38 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:39.581 [2024-04-27 06:50:09.438828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.438853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.438902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.438915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.438966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.438979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.439031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.439044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.581 [2024-04-27 06:50:09.439096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001010 cdw11:00000000 00:07:39.581 [2024-04-27 06:50:09.439110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.581 #39 NEW cov: 11665 ft: 13951 corp: 27/196b lim: 10 exec/s: 39 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:39.843 [2024-04-27 06:50:09.478778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.478803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.478870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.478884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.478934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.478946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.478997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000dc30 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.479011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.843 #40 NEW cov: 11665 ft: 13976 corp: 28/205b lim: 10 exec/s: 40 rss: 70Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:39.843 [2024-04-27 06:50:09.518790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.518815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.518869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.518882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.518933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.518949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.843 #41 NEW cov: 11665 ft: 13982 corp: 29/211b lim: 10 exec/s: 41 rss: 70Mb L: 6/10 MS: 1 ChangeBit- 00:07:39.843 
[2024-04-27 06:50:09.558882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f601 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.558906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.558961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000feff cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.558974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.559041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000be30 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.559055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.843 #42 NEW cov: 11665 ft: 13993 corp: 30/217b lim: 10 exec/s: 42 rss: 70Mb L: 6/10 MS: 1 ChangeBit- 00:07:39.843 [2024-04-27 06:50:09.599144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000002e cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.599168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.599220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.599234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.599284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.599297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.599347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.599359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.843 #43 NEW cov: 11665 ft: 13998 corp: 31/226b lim: 10 exec/s: 43 rss: 70Mb L: 9/10 MS: 1 InsertByte- 00:07:39.843 [2024-04-27 06:50:09.639000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.639023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.639077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.639090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 #44 NEW cov: 11665 ft: 14011 corp: 32/231b lim: 10 exec/s: 44 rss: 70Mb L: 5/10 MS: 1 EraseBytes- 00:07:39.843 [2024-04-27 06:50:09.679368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.679392] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.679450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000041 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.679463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.679515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.679531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.679583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000030 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.679596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.843 #45 NEW cov: 11665 ft: 14026 corp: 33/239b lim: 10 exec/s: 45 rss: 70Mb L: 8/10 MS: 1 ChangeByte- 00:07:39.843 [2024-04-27 06:50:09.719276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f6ff cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.719301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.843 [2024-04-27 06:50:09.719352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fe30 cdw11:00000000 00:07:39.843 [2024-04-27 06:50:09.719365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.843 #46 NEW cov: 11665 ft: 14034 corp: 34/243b lim: 10 exec/s: 46 rss: 70Mb L: 4/10 MS: 1 EraseBytes- 00:07:40.103 [2024-04-27 06:50:09.749616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f601 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.749641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.749693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fe08 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.749707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.749757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.749770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.749822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fe30 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.749835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.103 #47 NEW cov: 11665 ft: 14040 corp: 35/251b lim: 10 exec/s: 47 rss: 70Mb L: 8/10 MS: 1 CMP- DE: "\010\000"- 00:07:40.103 [2024-04-27 06:50:09.789495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.789519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.789571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000030 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.789584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 #48 NEW cov: 11665 ft: 14053 corp: 36/255b lim: 10 exec/s: 48 rss: 70Mb L: 4/10 MS: 1 EraseBytes- 00:07:40.103 [2024-04-27 06:50:09.829795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000300a cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.829819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.829871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.829885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.829935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001000 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.829967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.830018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.830031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.859806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003010 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.859830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.859881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000010 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.859895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.859944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001010 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.859956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 #50 NEW cov: 11665 ft: 14058 corp: 37/261b lim: 10 exec/s: 50 rss: 70Mb L: 6/10 MS: 2 EraseBytes-EraseBytes- 00:07:40.103 [2024-04-27 06:50:09.899908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.899934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.899986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000202d cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.899999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.900049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.900063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 #51 NEW cov: 11665 ft: 14063 corp: 38/267b lim: 10 exec/s: 51 rss: 70Mb L: 6/10 MS: 1 ChangeBit- 00:07:40.103 [2024-04-27 06:50:09.939751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000efef cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.939775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 #52 NEW cov: 11665 ft: 14072 corp: 39/270b lim: 10 exec/s: 52 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:40.103 [2024-04-27 06:50:09.980285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.980309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.980361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000060dc cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.980376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.980430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.980443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.980493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.980509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.103 [2024-04-27 06:50:09.980559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000300a cdw11:00000000 00:07:40.103 [2024-04-27 06:50:09.980572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.365 #53 NEW cov: 11665 ft: 14081 corp: 40/280b lim: 10 exec/s: 53 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:07:40.365 [2024-04-27 06:50:10.020478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000100a cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.020504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.020561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.020576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.020629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001000 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.020642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.020697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001010 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.020711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.020763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001010 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.020778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.365 #54 NEW cov: 11665 ft: 14114 corp: 41/290b lim: 10 exec/s: 54 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:40.365 [2024-04-27 06:50:10.060442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.060469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.060523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.060539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.060591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.060604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.060654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002800 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.060668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.365 #55 NEW cov: 11665 ft: 14147 corp: 42/299b lim: 10 exec/s: 55 rss: 70Mb L: 9/10 MS: 1 InsertByte- 00:07:40.365 [2024-04-27 06:50:10.100704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006f6f cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.100733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.100808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006f6f cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.100823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.100887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006f6f cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.100902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.100953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006f6f cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.100968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.101019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006f0b cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.101034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.365 #57 NEW cov: 11665 ft: 14156 corp: 43/309b lim: 10 exec/s: 57 rss: 70Mb L: 10/10 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:40.365 [2024-04-27 06:50:10.140548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000eb cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.140574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.140628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.140642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.140691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.140705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.365 #58 NEW cov: 11665 ft: 14158 corp: 44/315b lim: 10 exec/s: 58 rss: 70Mb L: 6/10 MS: 1 ChangeByte- 00:07:40.365 [2024-04-27 06:50:10.180935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.180960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.181011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000060dc cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.181025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.181073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.181087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.181137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000dcdc cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.181150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.365 [2024-04-27 06:50:10.181202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000350a cdw11:00000000 00:07:40.365 [2024-04-27 06:50:10.181216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:40.365 #59 NEW cov: 11665 ft: 14171 corp: 45/325b lim: 10 exec/s: 29 rss: 70Mb L: 10/10 MS: 1 ChangeASCIIInt-
00:07:40.365 #59 DONE cov: 11665 ft: 14171 corp: 45/325b lim: 10 exec/s: 29 rss: 70Mb
00:07:40.365 ###### Recommended dictionary. ######
00:07:40.365 "\001\000\000\000" # Uses: 3
00:07:40.365 "\010\000" # Uses: 0
00:07:40.365 ###### End of recommended dictionary. ######
00:07:40.365 Done 59 runs in 2 second(s)
00:07:40.625 06:50:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf
00:07:40.625 06:50:10 -- ../common.sh@72 -- # (( i++ ))
00:07:40.625 06:50:10 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:40.625 06:50:10 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:07:40.625 06:50:10 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:07:40.625 06:50:10 -- nvmf/run.sh@24 -- # local timen=1
00:07:40.625 06:50:10 -- nvmf/run.sh@25 -- # local core=0x1
00:07:40.625 06:50:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:40.625 06:50:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:07:40.625 06:50:10 -- nvmf/run.sh@29 -- # printf %02d 8
00:07:40.625 06:50:10 -- nvmf/run.sh@29 -- # port=4408
00:07:40.625 06:50:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:40.625 06:50:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:07:40.625 06:50:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:40.625 06:50:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock
00:07:40.884 [2024-04-27 06:50:10.362410] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:07:40.884 [2024-04-27 06:50:10.362507] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2622028 ]
00:07:40.884 EAL: No free 2048 kB hugepages reported on node 1
00:07:40.884 [2024-04-27 06:50:10.548571] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:40.884 [2024-04-27 06:50:10.568448] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:40.884 [2024-04-27 06:50:10.568578] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:40.884 [2024-04-27 06:50:10.620363] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:40.884 [2024-04-27 06:50:10.636681] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 ***
00:07:40.884 INFO: Running with entropic power schedule (0xFF, 100).
00:07:40.884 INFO: Seed: 644614200 00:07:40.884 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:40.884 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:40.884 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:40.884 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.884 [2024-04-27 06:50:10.692001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.884 [2024-04-27 06:50:10.692032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.884 #2 INITED cov: 11466 ft: 11467 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:40.884 [2024-04-27 06:50:10.721909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.884 [2024-04-27 06:50:10.721935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.884 #3 NEW cov: 11579 ft: 12083 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 CrossOver- 00:07:40.884 [2024-04-27 06:50:10.762032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.884 [2024-04-27 06:50:10.762058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 #4 NEW cov: 11585 ft: 12275 corp: 3/3b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeBit- 00:07:41.144 [2024-04-27 06:50:10.802198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.802223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 #5 NEW cov: 11670 ft: 12651 corp: 4/4b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:41.144 [2024-04-27 06:50:10.842265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.842290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 #6 NEW cov: 11670 ft: 12741 corp: 5/5b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:41.144 [2024-04-27 06:50:10.882571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.882596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.882655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.882669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.144 #7 
NEW cov: 11670 ft: 13452 corp: 6/7b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:41.144 [2024-04-27 06:50:10.922978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.923004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.923078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.923093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.923150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.923164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.923220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.923233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.144 #8 NEW cov: 11670 ft: 13883 corp: 7/11b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:41.144 [2024-04-27 06:50:10.973128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.973153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.973210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.973224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.973283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.973299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:10.973354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:10.973367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.144 #9 NEW cov: 11670 ft: 13908 corp: 8/15b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ShuffleBytes- 00:07:41.144 [2024-04-27 06:50:11.012914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:41.144 [2024-04-27 06:50:11.012940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.144 [2024-04-27 06:50:11.012996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.144 [2024-04-27 06:50:11.013010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.405 #10 NEW cov: 11670 ft: 13973 corp: 9/17b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeBinInt- 00:07:41.405 [2024-04-27 06:50:11.053522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.053548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.053604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.053618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.053673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.053686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.053741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.053754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.053808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.053821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.405 #11 NEW cov: 11670 ft: 14059 corp: 10/22b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertByte- 00:07:41.405 [2024-04-27 06:50:11.092958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.092983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.405 #12 NEW cov: 11670 ft: 14067 corp: 11/23b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:41.405 [2024-04-27 06:50:11.133559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.133583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.133648] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.133661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.133732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.133746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.133804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.133817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.405 #13 NEW cov: 11670 ft: 14101 corp: 12/27b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeByte- 00:07:41.405 [2024-04-27 06:50:11.173682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.173706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.173763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.173777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.173833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.173846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.405 [2024-04-27 06:50:11.173899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.405 [2024-04-27 06:50:11.173912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.405 #14 NEW cov: 11670 ft: 14197 corp: 13/31b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeByte- 00:07:41.406 [2024-04-27 06:50:11.213647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.213673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.406 [2024-04-27 06:50:11.213735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.213750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:41.406 [2024-04-27 06:50:11.213806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.213819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.406 #15 NEW cov: 11670 ft: 14367 corp: 14/34b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 EraseBytes- 00:07:41.406 [2024-04-27 06:50:11.253913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.253938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.406 [2024-04-27 06:50:11.253997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.254012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.406 [2024-04-27 06:50:11.254067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.254080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.406 [2024-04-27 06:50:11.254135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.254149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.406 #16 NEW cov: 11670 ft: 14378 corp: 15/38b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:41.406 [2024-04-27 06:50:11.293693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.293719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.406 [2024-04-27 06:50:11.293775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.406 [2024-04-27 06:50:11.293790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.666 #17 NEW cov: 11670 ft: 14396 corp: 16/40b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:41.666 [2024-04-27 06:50:11.333688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.333713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.666 #18 NEW cov: 11670 ft: 14409 corp: 17/41b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:41.666 [2024-04-27 06:50:11.363763] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.363789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.666 #19 NEW cov: 11670 ft: 14445 corp: 18/42b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CopyPart- 00:07:41.666 [2024-04-27 06:50:11.393839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.393863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.666 #20 NEW cov: 11670 ft: 14529 corp: 19/43b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:41.666 [2024-04-27 06:50:11.433976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.434001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.666 #21 NEW cov: 11670 ft: 14559 corp: 20/44b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:07:41.666 [2024-04-27 06:50:11.474720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.474745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.666 [2024-04-27 06:50:11.474806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.474820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.666 [2024-04-27 06:50:11.474877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.474891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.666 [2024-04-27 06:50:11.474949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.474962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.666 [2024-04-27 06:50:11.475018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.475032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.666 #22 NEW cov: 11670 ft: 14600 corp: 21/49b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CopyPart- 00:07:41.666 [2024-04-27 06:50:11.514372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.514400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.666 [2024-04-27 06:50:11.514458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.514472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.666 #23 NEW cov: 11670 ft: 14616 corp: 22/51b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:41.666 [2024-04-27 06:50:11.554297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.666 [2024-04-27 06:50:11.554323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.185 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:42.185 #24 NEW cov: 11693 ft: 14629 corp: 23/52b lim: 5 exec/s: 24 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:07:42.185 [2024-04-27 06:50:11.855207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 06:50:11.855241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.185 [2024-04-27 06:50:11.855312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 06:50:11.855326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.185 #25 NEW cov: 11693 ft: 14637 corp: 24/54b lim: 5 exec/s: 25 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:42.185 [2024-04-27 06:50:11.895677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 06:50:11.895704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.185 [2024-04-27 06:50:11.895762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 06:50:11.895776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.185 [2024-04-27 06:50:11.895829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 06:50:11.895843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.185 [2024-04-27 06:50:11.895896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 
06:50:11.895910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.185 [2024-04-27 06:50:11.895962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.185 [2024-04-27 06:50:11.895975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.185 #26 NEW cov: 11693 ft: 14649 corp: 25/59b lim: 5 exec/s: 26 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:42.185 [2024-04-27 06:50:11.935519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.935545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:11.935601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.935615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:11.935669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.935685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.186 #27 NEW cov: 11693 ft: 14664 corp: 26/62b lim: 5 exec/s: 27 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:07:42.186 [2024-04-27 06:50:11.975777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.975802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:11.975857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.975870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:11.975925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.975938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:11.975991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:11.976004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.186 #28 NEW cov: 11693 ft: 14707 corp: 27/66b lim: 5 exec/s: 28 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:42.186 [2024-04-27 06:50:12.016054] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.016079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.016148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.016164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.016217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.016230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.016285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.016298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.016353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.016366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.186 #29 NEW cov: 11693 ft: 14743 corp: 28/71b lim: 5 exec/s: 29 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:42.186 [2024-04-27 06:50:12.056126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.056151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.056206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.056219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.056272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.056285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.056338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.056350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.186 [2024-04-27 06:50:12.056407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.186 [2024-04-27 06:50:12.056420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.446 #30 NEW cov: 11693 ft: 14775 corp: 29/76b lim: 5 exec/s: 30 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:42.446 [2024-04-27 06:50:12.095688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.095713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.446 #31 NEW cov: 11693 ft: 14787 corp: 30/77b lim: 5 exec/s: 31 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:42.446 [2024-04-27 06:50:12.125740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.125765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.446 #32 NEW cov: 11693 ft: 14794 corp: 31/78b lim: 5 exec/s: 32 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:42.446 [2024-04-27 06:50:12.155850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.155875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.446 #33 NEW cov: 11693 ft: 14801 corp: 32/79b lim: 5 exec/s: 33 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:42.446 [2024-04-27 06:50:12.186383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.186411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.186464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.186478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.186530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.186544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.186595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.186608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.446 #34 NEW cov: 11693 ft: 14813 corp: 33/83b lim: 5 exec/s: 34 rss: 70Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:42.446 [2024-04-27 06:50:12.226501] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.226526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.226580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.226594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.226649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.226662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.226713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.226726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.446 #35 NEW cov: 11693 ft: 14830 corp: 34/87b lim: 5 exec/s: 35 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:07:42.446 [2024-04-27 06:50:12.266639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.266666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.266718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.266731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.266783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.266796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.266848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.266861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.446 #36 NEW cov: 11693 ft: 14888 corp: 35/91b lim: 5 exec/s: 36 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:07:42.446 [2024-04-27 06:50:12.306779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.306803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:42.446 [2024-04-27 06:50:12.306859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.306873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.306926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.306940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.446 [2024-04-27 06:50:12.306991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.446 [2024-04-27 06:50:12.307004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.446 #37 NEW cov: 11693 ft: 14911 corp: 36/95b lim: 5 exec/s: 37 rss: 70Mb L: 4/5 MS: 1 ChangeByte- 00:07:42.708 [2024-04-27 06:50:12.347014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.347040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.347095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.347108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.347161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.347174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.347226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.347243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.347298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.347313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.708 #38 NEW cov: 11693 ft: 14933 corp: 37/100b lim: 5 exec/s: 38 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:42.708 [2024-04-27 06:50:12.386996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.387021] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.387075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.387088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.387139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.387168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.387221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.387234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.708 #39 NEW cov: 11693 ft: 14946 corp: 38/104b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:07:42.708 [2024-04-27 06:50:12.427214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.427240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.427294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.427309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.427362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.427375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.427430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.427443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.427497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.427511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.708 #40 NEW cov: 11693 ft: 14959 corp: 39/109b lim: 5 exec/s: 40 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:42.708 [2024-04-27 06:50:12.467212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 
[2024-04-27 06:50:12.467237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.467294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.467307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.467361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.467374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.708 [2024-04-27 06:50:12.467429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.467442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.708 #41 NEW cov: 11693 ft: 15000 corp: 40/113b lim: 5 exec/s: 41 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:42.708 [2024-04-27 06:50:12.506839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.708 [2024-04-27 06:50:12.506864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.708 #42 NEW cov: 11693 ft: 15025 corp: 41/114b lim: 5 exec/s: 42 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:07:42.709 [2024-04-27 06:50:12.547603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.547627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.547682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.547695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.547752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.547765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.547818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.547830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.547884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 
cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.547897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.709 #43 NEW cov: 11693 ft: 15031 corp: 42/119b lim: 5 exec/s: 43 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:07:42.709 [2024-04-27 06:50:12.587767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.587791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.587850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.587863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.587917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.587930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.587982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.587996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.709 [2024-04-27 06:50:12.588049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.709 [2024-04-27 06:50:12.588061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.969 #44 NEW cov: 11693 ft: 15083 corp: 43/124b lim: 5 exec/s: 44 rss: 70Mb L: 5/5 MS: 1 ChangeByte- 00:07:42.969 [2024-04-27 06:50:12.627548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.969 [2024-04-27 06:50:12.627572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.969 [2024-04-27 06:50:12.627625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.969 [2024-04-27 06:50:12.627639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.969 [2024-04-27 06:50:12.627693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.969 [2024-04-27 06:50:12.627706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.969 #45 NEW cov: 11693 ft: 15091 corp: 44/127b lim: 5 exec/s: 45 rss: 
70Mb L: 3/5 MS: 1 EraseBytes-
00:07:42.969 [2024-04-27 06:50:12.667958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:42.969 [2024-04-27 06:50:12.667982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:42.969 [2024-04-27 06:50:12.668032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:42.969 [2024-04-27 06:50:12.668045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:42.969 [2024-04-27 06:50:12.668098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:42.969 [2024-04-27 06:50:12.668127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:42.969 [2024-04-27 06:50:12.668179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:42.969 [2024-04-27 06:50:12.668192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:42.969 [2024-04-27 06:50:12.668245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:42.969 [2024-04-27 06:50:12.668259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:42.969 #46 NEW cov: 11693 ft: 15099 corp: 45/132b lim: 5 exec/s: 23 rss: 70Mb L: 5/5 MS: 1 CrossOver-
00:07:42.969 #46 DONE cov: 11693 ft: 15099 corp: 45/132b lim: 5 exec/s: 23 rss: 70Mb
00:07:42.969 Done 46 runs in 2 second(s)
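The "#N" status lines above are standard libFuzzer output: "#N" is the execution count at which the event was logged, "cov" and "ft" are the edge-coverage and feature counters, "corp" is the corpus size in units/bytes, "lim" is the current input-length cap, and "MS" names the mutation sequence that produced the input; "#46 DONE" followed by "Done 46 runs in 2 second(s)" closes the run for fuzzer 8. For triaging a saved console log offline, a minimal post-processing sketch (the file name console.log and the helper itself are assumptions, not part of the SPDK tree; it relies only on the fixed shape of the DONE lines above):

  # Pull the final summary of every fuzzer run out of a captured console log
  # and print execution count, edge coverage, features, and corpus size.
  grep -o '#[0-9]* DONE cov: [0-9]* ft: [0-9]* corp: [0-9]*/[0-9]*b' console.log |
    awk '{ gsub(/#/, "", $1); printf "execs=%s cov=%s ft=%s corpus=%s\n", $1, $4, $6, $8 }'

Against the run above this would print "execs=46 cov=11693 ft=15099 corpus=45/132b".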
00:07:42.969 06:50:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf
00:07:42.969 06:50:12 -- ../common.sh@72 -- # (( i++ ))
00:07:42.969 06:50:12 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:42.969 06:50:12 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1
00:07:42.969 06:50:12 -- nvmf/run.sh@23 -- # local fuzzer_type=9
00:07:42.969 06:50:12 -- nvmf/run.sh@24 -- # local timen=1
00:07:42.969 06:50:12 -- nvmf/run.sh@25 -- # local core=0x1
00:07:42.969 06:50:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:07:42.969 06:50:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf
00:07:42.969 06:50:12 -- nvmf/run.sh@29 -- # printf %02d 9
00:07:42.969 06:50:12 -- nvmf/run.sh@29 -- # port=4409
00:07:42.969 06:50:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:07:42.969 06:50:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409'
00:07:42.969 06:50:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:42.969 06:50:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock
00:07:42.969 [2024-04-27 06:50:12.850843] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... [2024-04-27 06:50:12.850938] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2622428 ]
00:07:43.232 EAL: No free 2048 kB hugepages reported on node 1
00:07:43.232 [2024-04-27 06:50:13.029951] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:43.232 [2024-04-27 06:50:13.049289] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:43.232 [2024-04-27 06:50:13.049423] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:43.232 [2024-04-27 06:50:13.100838] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:43.232 [2024-04-27 06:50:13.117159] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 ***
00:07:43.491 INFO: Running with entropic power schedule (0xFF, 100).
00:07:43.491 INFO: Seed: 3124604487
00:07:43.491 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:07:43.491 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:07:43.491 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:07:43.491 INFO: A corpus is not provided, starting from an empty corpus
00:07:43.491 [2024-04-27 06:50:13.193097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:43.491 [2024-04-27 06:50:13.193135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:43.491 #2 INITED cov: 11466 ft: 11467 corp: 1/1b exec/s: 0 rss: 66Mb
00:07:43.491 [2024-04-27 06:50:13.233000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:43.491 [2024-04-27 06:50:13.233033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:43.491 #3 NEW cov: 11579 ft: 12021 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte-
00:07:43.491 [2024-04-27 06:50:13.283517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:43.491 [2024-04-27 06:50:13.283546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:43.491 [2024-04-27 06:50:13.283672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:43.491 [2024-04-27 06:50:13.283688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:43.491 #4 NEW cov: 11585 ft: 
12899 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:43.491 [2024-04-27 06:50:13.333789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.491 [2024-04-27 06:50:13.333819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.491 [2024-04-27 06:50:13.333942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.491 [2024-04-27 06:50:13.333960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.491 #5 NEW cov: 11670 ft: 13111 corp: 4/6b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeBinInt- 00:07:43.491 [2024-04-27 06:50:13.384581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.491 [2024-04-27 06:50:13.384625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.491 [2024-04-27 06:50:13.384746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.491 [2024-04-27 06:50:13.384765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.491 [2024-04-27 06:50:13.384879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.491 [2024-04-27 06:50:13.384896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.491 [2024-04-27 06:50:13.385011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.491 [2024-04-27 06:50:13.385027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.492 [2024-04-27 06:50:13.385149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.492 [2024-04-27 06:50:13.385165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.751 #6 NEW cov: 11670 ft: 13486 corp: 5/11b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:43.752 [2024-04-27 06:50:13.423741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.423769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.752 #7 NEW cov: 11670 ft: 13683 corp: 6/12b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:07:43.752 [2024-04-27 06:50:13.464318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.464346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.464464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.464482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.464604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.464620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.752 #8 NEW cov: 11670 ft: 13954 corp: 7/15b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 CrossOver- 00:07:43.752 [2024-04-27 06:50:13.504243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.504271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.504393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.504416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.752 #9 NEW cov: 11670 ft: 14091 corp: 8/17b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:43.752 [2024-04-27 06:50:13.544039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.544068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.752 #10 NEW cov: 11670 ft: 14131 corp: 9/18b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 CrossOver- 00:07:43.752 [2024-04-27 06:50:13.584504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.584534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.584656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.584673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.752 #11 NEW cov: 11670 ft: 14155 corp: 10/20b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 CopyPart- 00:07:43.752 [2024-04-27 06:50:13.635210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 
06:50:13.635239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.635363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.635382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.635510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.635531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.752 [2024-04-27 06:50:13.635649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.752 [2024-04-27 06:50:13.635668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.012 #12 NEW cov: 11670 ft: 14168 corp: 11/24b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:44.012 [2024-04-27 06:50:13.674740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.674770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.674893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.674908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.012 #13 NEW cov: 11670 ft: 14196 corp: 12/26b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:44.012 [2024-04-27 06:50:13.714827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.714854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.714979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.714998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.012 #14 NEW cov: 11670 ft: 14219 corp: 13/28b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:44.012 [2024-04-27 06:50:13.755199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.755227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.755349] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.755366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.755499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.755518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.012 #15 NEW cov: 11670 ft: 14283 corp: 14/31b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:44.012 [2024-04-27 06:50:13.794819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.794845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.012 #16 NEW cov: 11670 ft: 14309 corp: 15/32b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:44.012 [2024-04-27 06:50:13.835493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.835523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.835648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.835666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.835800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.835818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.012 #17 NEW cov: 11670 ft: 14320 corp: 16/35b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:07:44.012 [2024-04-27 06:50:13.875320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.875348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.012 [2024-04-27 06:50:13.875479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.012 [2024-04-27 06:50:13.875495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.012 #18 NEW cov: 11670 ft: 14344 corp: 17/37b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:44.272 [2024-04-27 06:50:13.915646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.272 [2024-04-27 06:50:13.915673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.272 [2024-04-27 06:50:13.915805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.272 [2024-04-27 06:50:13.915822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.272 [2024-04-27 06:50:13.915932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.272 [2024-04-27 06:50:13.915951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.272 #19 NEW cov: 11670 ft: 14368 corp: 18/40b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 ChangeBit- 00:07:44.272 [2024-04-27 06:50:13.955787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.272 [2024-04-27 06:50:13.955815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.272 [2024-04-27 06:50:13.955947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.272 [2024-04-27 06:50:13.955965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.273 [2024-04-27 06:50:13.956091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:13.956110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.273 #20 NEW cov: 11670 ft: 14380 corp: 19/43b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:07:44.273 [2024-04-27 06:50:13.995631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:13.995662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.273 [2024-04-27 06:50:13.995793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:13.995810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.273 #21 NEW cov: 11670 ft: 14394 corp: 20/45b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:44.273 [2024-04-27 06:50:14.036431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:14.036458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:44.273 [2024-04-27 06:50:14.036583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:14.036600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.273 [2024-04-27 06:50:14.036725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:14.036742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.273 [2024-04-27 06:50:14.036863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:14.036879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.273 [2024-04-27 06:50:14.036998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.273 [2024-04-27 06:50:14.037016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.532 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.532 #22 NEW cov: 11693 ft: 14437 corp: 21/50b lim: 5 exec/s: 22 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:44.532 [2024-04-27 06:50:14.346561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.532 [2024-04-27 06:50:14.346604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.532 [2024-04-27 06:50:14.346738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.532 [2024-04-27 06:50:14.346758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.532 [2024-04-27 06:50:14.346877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.532 [2024-04-27 06:50:14.346896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.532 #23 NEW cov: 11693 ft: 14482 corp: 22/53b lim: 5 exec/s: 23 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:44.533 [2024-04-27 06:50:14.396722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.533 [2024-04-27 06:50:14.396759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.533 [2024-04-27 06:50:14.396882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.533 [2024-04-27 06:50:14.396899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.533 #24 NEW cov: 11693 ft: 14547 corp: 23/55b lim: 5 exec/s: 24 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:44.792 [2024-04-27 06:50:14.436625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.436653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.792 #25 NEW cov: 11693 ft: 14569 corp: 24/56b lim: 5 exec/s: 25 rss: 70Mb L: 1/5 MS: 1 CopyPart- 00:07:44.792 [2024-04-27 06:50:14.477458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.477486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.792 [2024-04-27 06:50:14.477626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.477642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.792 [2024-04-27 06:50:14.477766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.477784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.792 [2024-04-27 06:50:14.477907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.477923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.792 #26 NEW cov: 11693 ft: 14580 corp: 25/60b lim: 5 exec/s: 26 rss: 70Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:44.792 [2024-04-27 06:50:14.517338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.517367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.792 [2024-04-27 06:50:14.517484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.517501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.792 [2024-04-27 06:50:14.517614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.792 [2024-04-27 06:50:14.517631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:44.792 #27 NEW cov: 11693 ft: 14583 corp: 26/63b lim: 5 exec/s: 27 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:44.793 [2024-04-27 06:50:14.556928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.556955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.793 #28 NEW cov: 11693 ft: 14590 corp: 27/64b lim: 5 exec/s: 28 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:44.793 [2024-04-27 06:50:14.587803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.587831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.793 [2024-04-27 06:50:14.587942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.587960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.793 [2024-04-27 06:50:14.588075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.588091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.793 [2024-04-27 06:50:14.588207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.588223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.793 #29 NEW cov: 11693 ft: 14596 corp: 28/68b lim: 5 exec/s: 29 rss: 70Mb L: 4/5 MS: 1 ChangeByte- 00:07:44.793 [2024-04-27 06:50:14.627207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.627235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.793 #30 NEW cov: 11693 ft: 14659 corp: 29/69b lim: 5 exec/s: 30 rss: 70Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:44.793 [2024-04-27 06:50:14.667304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.793 [2024-04-27 06:50:14.667334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.052 #31 NEW cov: 11693 ft: 14662 corp: 30/70b lim: 5 exec/s: 31 rss: 70Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:45.052 [2024-04-27 06:50:14.707479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.052 [2024-04-27 06:50:14.707509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.052 #32 NEW cov: 11693 ft: 14666 corp: 31/71b lim: 5 exec/s: 32 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:07:45.052 [2024-04-27 06:50:14.748111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.052 [2024-04-27 06:50:14.748139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.052 [2024-04-27 06:50:14.748261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.052 [2024-04-27 06:50:14.748279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.052 [2024-04-27 06:50:14.748400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.052 [2024-04-27 06:50:14.748417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.052 #33 NEW cov: 11693 ft: 14747 corp: 32/74b lim: 5 exec/s: 33 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:45.052 [2024-04-27 06:50:14.788189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.788218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.053 [2024-04-27 06:50:14.788334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.788350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.053 [2024-04-27 06:50:14.788471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.788487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.053 #34 NEW cov: 11693 ft: 14771 corp: 33/77b lim: 5 exec/s: 34 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:07:45.053 [2024-04-27 06:50:14.837820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.837848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.053 #35 NEW cov: 11693 ft: 14773 corp: 34/78b lim: 5 exec/s: 35 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:45.053 [2024-04-27 06:50:14.877943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.877971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.053 #36 NEW cov: 11693 ft: 14782 corp: 35/79b 
lim: 5 exec/s: 36 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:45.053 [2024-04-27 06:50:14.908280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.908307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.053 [2024-04-27 06:50:14.908433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.053 [2024-04-27 06:50:14.908449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.053 #37 NEW cov: 11693 ft: 14784 corp: 36/81b lim: 5 exec/s: 37 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:45.312 [2024-04-27 06:50:14.948990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.312 [2024-04-27 06:50:14.949019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:14.949134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:14.949152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:14.949271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:14.949289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:14.949415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:14.949438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.313 #38 NEW cov: 11693 ft: 14805 corp: 37/85b lim: 5 exec/s: 38 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:07:45.313 [2024-04-27 06:50:14.988122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:14.988152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.313 #39 NEW cov: 11693 ft: 14818 corp: 38/86b lim: 5 exec/s: 39 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:45.313 [2024-04-27 06:50:15.028467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.028508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.313 #40 NEW cov: 11693 ft: 14823 corp: 39/87b lim: 5 exec/s: 40 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:45.313 [2024-04-27 06:50:15.068790] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.068818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:15.068936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.068953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.313 #41 NEW cov: 11693 ft: 14877 corp: 40/89b lim: 5 exec/s: 41 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:45.313 [2024-04-27 06:50:15.109628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.109656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:15.109779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.109795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:15.109915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.109931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:15.110002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.110018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:15.110135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.110150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.313 #42 NEW cov: 11693 ft: 14907 corp: 41/94b lim: 5 exec/s: 42 rss: 70Mb L: 5/5 MS: 1 ChangeByte- 00:07:45.313 [2024-04-27 06:50:15.159544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.159571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.313 [2024-04-27 06:50:15.159686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.313 [2024-04-27 06:50:15.159702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0
00:07:45.313 [2024-04-27 06:50:15.159819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:45.313 [2024-04-27 06:50:15.159835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:45.313 [2024-04-27 06:50:15.159967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:45.313 [2024-04-27 06:50:15.159982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:45.313 #43 NEW cov: 11693 ft: 14931 corp: 42/98b lim: 5 exec/s: 21 rss: 70Mb L: 4/5 MS: 1 InsertByte-
00:07:45.313 #43 DONE cov: 11693 ft: 14931 corp: 42/98b lim: 5 exec/s: 21 rss: 70Mb
00:07:45.313 Done 43 runs in 2 second(s)
00:07:45.573 06:50:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf
00:07:45.573 06:50:15 -- ../common.sh@72 -- # (( i++ ))
00:07:45.573 06:50:15 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:45.573 06:50:15 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:07:45.573 06:50:15 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:07:45.573 06:50:15 -- nvmf/run.sh@24 -- # local timen=1
00:07:45.573 06:50:15 -- nvmf/run.sh@25 -- # local core=0x1
00:07:45.573 06:50:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:45.573 06:50:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:07:45.573 06:50:15 -- nvmf/run.sh@29 -- # printf %02d 10
00:07:45.573 06:50:15 -- nvmf/run.sh@29 -- # port=4410
00:07:45.573 06:50:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:45.573 06:50:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:07:45.573 06:50:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:45.573 06:50:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock
00:07:45.573 [2024-04-27 06:50:15.342681] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
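The xtrace above (identical in shape to the one between runs 8 and 9) shows how nvmf/run.sh and its common.sh driver parameterize each fuzzer: the loop counter i selects the target via start_llvm_fuzz, and the fuzzer index then derives the corpus directory, the per-run JSON config, the -Z target selector, and the NVMe/TCP listen port, which the trace suggests is formed as "44" plus the zero-padded index (fuzzer 9 listens on 4409, fuzzer 10 on 4410). A condensed sketch of that pattern, reconstructed only from the traced commands (SPDK_DIR and the redirect of the sed output are assumptions; this is not a copy of nvmf/run.sh):

  # Reconstructed from the xtrace above; paths shortened via an assumed SPDK_DIR.
  start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local port=44$(printf %02d "$fuzzer_type")   # 9 -> 4409, 10 -> 4410
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    # Re-point the shared target config at this run's port (output redirect assumed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
  }
  start_llvm_fuzz 10 1 0x1   # as invoked by common.sh in the trace above

Each run therefore gets a freshly restarted NVMe/TCP target on its own port; the startup lines around this note show that target coming up on 4410 before fuzzer 10 begins.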
00:07:45.573 [2024-04-27 06:50:15.342768] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2622971 ] 00:07:45.573 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.833 [2024-04-27 06:50:15.526593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.833 [2024-04-27 06:50:15.545690] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.833 [2024-04-27 06:50:15.545830] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.833 [2024-04-27 06:50:15.597223] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.833 [2024-04-27 06:50:15.613504] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:45.833 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.833 INFO: Seed: 1326631105 00:07:45.833 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:45.833 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:45.833 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:45.833 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.833 #2 INITED exec/s: 0 rss: 59Mb 00:07:45.833 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.833 This may also happen if the target rejected all inputs we tried so far 00:07:45.833 [2024-04-27 06:50:15.668738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.833 [2024-04-27 06:50:15.668766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.093 NEW_FUNC[1/662]: 0x4aa3c0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:46.093 NEW_FUNC[2/662]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.093 #3 NEW cov: 11484 ft: 11485 corp: 2/12b lim: 40 exec/s: 0 rss: 66Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:07:46.093 [2024-04-27 06:50:15.969494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8f2 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.093 [2024-04-27 06:50:15.969534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.352 NEW_FUNC[1/1]: 0xfa9860 in spdk_sock_prep_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/sock.h:284 00:07:46.352 #4 NEW cov: 11602 ft: 11978 corp: 3/24b lim: 40 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 InsertByte- 00:07:46.352 [2024-04-27 06:50:16.019655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.019682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.352 [2024-04-27 06:50:16.019741] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.019754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.352 #13 NEW cov: 11608 ft: 12544 corp: 4/43b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 4 ShuffleBytes-ChangeBinInt-CrossOver-InsertRepeatedBytes- 00:07:46.352 [2024-04-27 06:50:16.059766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.059794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.352 [2024-04-27 06:50:16.059854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.059868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.352 #14 NEW cov: 11693 ft: 12790 corp: 5/62b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 CopyPart- 00:07:46.352 [2024-04-27 06:50:16.099891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.099915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.352 [2024-04-27 06:50:16.099989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.100003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.352 #15 NEW cov: 11693 ft: 12941 corp: 6/84b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 CrossOver- 00:07:46.352 [2024-04-27 06:50:16.140002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.140027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.352 [2024-04-27 06:50:16.140086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffd8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.352 [2024-04-27 06:50:16.140100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.352 #16 NEW cov: 11693 ft: 13068 corp: 7/103b lim: 40 exec/s: 0 rss: 67Mb L: 19/22 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:46.352 [2024-04-27 06:50:16.180105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d90000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.353 [2024-04-27 06:50:16.180131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.353 [2024-04-27 06:50:16.180205] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:06ffffff cdw11:ffffd8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.353 [2024-04-27 06:50:16.180220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.353 #17 NEW cov: 11693 ft: 13161 corp: 8/122b lim: 40 exec/s: 0 rss: 67Mb L: 19/22 MS: 1 ChangeBinInt- 00:07:46.353 [2024-04-27 06:50:16.220468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8bbbb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.353 [2024-04-27 06:50:16.220494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.353 [2024-04-27 06:50:16.220554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:bbbbbbbb cdw11:bbbbbbbb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.353 [2024-04-27 06:50:16.220568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.353 [2024-04-27 06:50:16.220646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.353 [2024-04-27 06:50:16.220661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.353 [2024-04-27 06:50:16.220720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d8d80ad8 cdw11:d8d8d80a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.353 [2024-04-27 06:50:16.220734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.612 #23 NEW cov: 11693 ft: 13715 corp: 9/154b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:46.612 [2024-04-27 06:50:16.260235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:99999999 cdw11:9999990a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.612 [2024-04-27 06:50:16.260262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.612 #24 NEW cov: 11693 ft: 13875 corp: 10/162b lim: 40 exec/s: 0 rss: 69Mb L: 8/32 MS: 1 InsertRepeatedBytes- 00:07:46.612 [2024-04-27 06:50:16.290270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:99999999 cdw11:9999990a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.290295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.613 #25 NEW cov: 11693 ft: 13962 corp: 11/170b lim: 40 exec/s: 0 rss: 69Mb L: 8/32 MS: 1 CopyPart- 00:07:46.613 [2024-04-27 06:50:16.330384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.330419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.613 #26 NEW cov: 11693 ft: 14019 corp: 12/181b lim: 40 exec/s: 0 rss: 69Mb L: 11/32 MS: 1 ChangeBinInt- 00:07:46.613 [2024-04-27 
06:50:16.360626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.360650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.613 [2024-04-27 06:50:16.360726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.360740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.613 #27 NEW cov: 11693 ft: 14072 corp: 13/203b lim: 40 exec/s: 0 rss: 69Mb L: 22/32 MS: 1 ChangeByte- 00:07:46.613 [2024-04-27 06:50:16.400625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8f2 cdw11:58d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.400650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.613 #28 NEW cov: 11693 ft: 14127 corp: 14/215b lim: 40 exec/s: 0 rss: 69Mb L: 12/32 MS: 1 ChangeBit- 00:07:46.613 [2024-04-27 06:50:16.440792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.440817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.613 [2024-04-27 06:50:16.440878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d821d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.440892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.613 #29 NEW cov: 11693 ft: 14177 corp: 15/238b lim: 40 exec/s: 0 rss: 69Mb L: 23/32 MS: 1 InsertByte- 00:07:46.613 [2024-04-27 06:50:16.480960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.480986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.613 [2024-04-27 06:50:16.481063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d8d8f2 cdw11:58d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.613 [2024-04-27 06:50:16.481077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.613 #30 NEW cov: 11693 ft: 14187 corp: 16/258b lim: 40 exec/s: 0 rss: 69Mb L: 20/32 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:46.872 [2024-04-27 06:50:16.521045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.872 [2024-04-27 06:50:16.521070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.872 [2024-04-27 06:50:16.521131] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.872 [2024-04-27 06:50:16.521145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.872 #31 NEW cov: 11693 ft: 14214 corp: 17/277b lim: 40 exec/s: 0 rss: 69Mb L: 19/32 MS: 1 ChangeBinInt- 00:07:46.872 [2024-04-27 06:50:16.561028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.872 [2024-04-27 06:50:16.561053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.872 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.872 #32 NEW cov: 11716 ft: 14289 corp: 18/288b lim: 40 exec/s: 0 rss: 69Mb L: 11/32 MS: 1 EraseBytes- 00:07:46.873 [2024-04-27 06:50:16.601169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8f3 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.601194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 #33 NEW cov: 11716 ft: 14313 corp: 19/299b lim: 40 exec/s: 0 rss: 69Mb L: 11/32 MS: 1 ChangeByte- 00:07:46.873 [2024-04-27 06:50:16.641285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.641310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 #34 NEW cov: 11716 ft: 14373 corp: 20/311b lim: 40 exec/s: 34 rss: 69Mb L: 12/32 MS: 1 EraseBytes- 00:07:46.873 [2024-04-27 06:50:16.681466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8c8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.681491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 #35 NEW cov: 11716 ft: 14435 corp: 21/322b lim: 40 exec/s: 35 rss: 69Mb L: 11/32 MS: 1 ChangeBit- 00:07:46.873 [2024-04-27 06:50:16.711868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ad80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.711894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 [2024-04-27 06:50:16.711954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.711968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.873 [2024-04-27 06:50:16.712028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.712042] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.873 [2024-04-27 06:50:16.712099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000000d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.712112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.873 #37 NEW cov: 11716 ft: 14442 corp: 22/356b lim: 40 exec/s: 37 rss: 70Mb L: 34/34 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:46.873 [2024-04-27 06:50:16.751586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8c8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.873 [2024-04-27 06:50:16.751611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.132 #38 NEW cov: 11716 ft: 14447 corp: 23/367b lim: 40 exec/s: 38 rss: 70Mb L: 11/34 MS: 1 CrossOver- 00:07:47.132 [2024-04-27 06:50:16.792100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ad80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.792128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.132 [2024-04-27 06:50:16.792183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.792197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.132 [2024-04-27 06:50:16.792252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:d8d8d800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.792265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.132 [2024-04-27 06:50:16.792322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.792335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.132 #39 NEW cov: 11716 ft: 14487 corp: 24/404b lim: 40 exec/s: 39 rss: 70Mb L: 37/37 MS: 1 CrossOver- 00:07:47.132 [2024-04-27 06:50:16.831973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.831999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.132 [2024-04-27 06:50:16.832055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.832069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.132 #40 NEW cov: 11716 ft: 14499 corp: 25/426b lim: 40 exec/s: 40 rss: 70Mb 
L: 22/37 MS: 1 ShuffleBytes- 00:07:47.132 [2024-04-27 06:50:16.862041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d87bd8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.862066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.132 [2024-04-27 06:50:16.862124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.862137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.132 #41 NEW cov: 11716 ft: 14508 corp: 26/448b lim: 40 exec/s: 41 rss: 70Mb L: 22/37 MS: 1 ChangeByte- 00:07:47.132 [2024-04-27 06:50:16.902104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.902129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.132 #42 NEW cov: 11716 ft: 14522 corp: 27/457b lim: 40 exec/s: 42 rss: 70Mb L: 9/37 MS: 1 EraseBytes- 00:07:47.132 [2024-04-27 06:50:16.932157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.932182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.132 #43 NEW cov: 11716 ft: 14533 corp: 28/469b lim: 40 exec/s: 43 rss: 70Mb L: 12/37 MS: 1 CopyPart- 00:07:47.132 [2024-04-27 06:50:16.962252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8f3 cdw11:d8d8d81f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.132 [2024-04-27 06:50:16.962277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 #44 NEW cov: 11716 ft: 14534 corp: 29/480b lim: 40 exec/s: 44 rss: 70Mb L: 11/37 MS: 1 ChangeBinInt- 00:07:47.133 [2024-04-27 06:50:17.002363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.133 [2024-04-27 06:50:17.002387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 #45 NEW cov: 11716 ft: 14598 corp: 30/491b lim: 40 exec/s: 45 rss: 70Mb L: 11/37 MS: 1 ChangeBinInt- 00:07:47.392 [2024-04-27 06:50:17.032560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8282a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.032587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 [2024-04-27 06:50:17.032665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.032680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.392 #46 NEW cov: 11716 ft: 14609 corp: 31/513b lim: 40 exec/s: 46 rss: 70Mb L: 22/37 MS: 1 ChangeBinInt- 00:07:47.392 [2024-04-27 06:50:17.072573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.072598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 #47 NEW cov: 11716 ft: 14614 corp: 32/524b lim: 40 exec/s: 47 rss: 70Mb L: 11/37 MS: 1 CopyPart- 00:07:47.392 [2024-04-27 06:50:17.102681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:99999999 cdw11:9999990a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.102706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 #48 NEW cov: 11716 ft: 14620 corp: 33/532b lim: 40 exec/s: 48 rss: 70Mb L: 8/37 MS: 1 CopyPart- 00:07:47.392 [2024-04-27 06:50:17.142876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ad80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.142901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 [2024-04-27 06:50:17.142957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.142970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.392 #49 NEW cov: 11716 ft: 14623 corp: 34/551b lim: 40 exec/s: 49 rss: 70Mb L: 19/37 MS: 1 EraseBytes- 00:07:47.392 [2024-04-27 06:50:17.182857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d85b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.182882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 #50 NEW cov: 11716 ft: 14633 corp: 35/560b lim: 40 exec/s: 50 rss: 70Mb L: 9/37 MS: 1 ChangeByte- 00:07:47.392 [2024-04-27 06:50:17.222983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d85bd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.223007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 #51 NEW cov: 11716 ft: 14649 corp: 36/569b lim: 40 exec/s: 51 rss: 70Mb L: 9/37 MS: 1 ShuffleBytes- 00:07:47.392 [2024-04-27 06:50:17.263120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8c5d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.392 [2024-04-27 06:50:17.263151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.392 #52 NEW cov: 11716 ft: 14702 corp: 37/579b lim: 40 exec/s: 52 rss: 70Mb L: 10/37 MS: 1 InsertByte- 00:07:47.651 [2024-04-27 06:50:17.303392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8e2ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.303436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.303494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffd8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.303508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.651 #53 NEW cov: 11716 ft: 14718 corp: 38/598b lim: 40 exec/s: 53 rss: 70Mb L: 19/37 MS: 1 ChangeBinInt- 00:07:47.651 [2024-04-27 06:50:17.343499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8e2ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.343523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.343582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffd8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.343595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.651 #54 NEW cov: 11716 ft: 14722 corp: 39/617b lim: 40 exec/s: 54 rss: 70Mb L: 19/37 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:47.651 [2024-04-27 06:50:17.383611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.383635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.383693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d8d90000 cdw11:06ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.383706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.651 #55 NEW cov: 11716 ft: 14731 corp: 40/640b lim: 40 exec/s: 55 rss: 70Mb L: 23/37 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:47.651 [2024-04-27 06:50:17.423693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ad80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.423718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.423777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.423791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.651 #56 NEW cov: 11716 ft: 14742 corp: 41/659b lim: 40 exec/s: 56 rss: 70Mb L: 19/37 MS: 1 ShuffleBytes- 00:07:47.651 [2024-04-27 06:50:17.463841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.463865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.463925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff13 cdw11:000000d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.463941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.651 #57 NEW cov: 11716 ft: 14752 corp: 42/678b lim: 40 exec/s: 57 rss: 70Mb L: 19/37 MS: 1 ChangeBinInt- 00:07:47.651 [2024-04-27 06:50:17.504192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.504217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.504276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00d8bbbb cdw11:bbbbbbbb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.504289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.504345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:bbbbbbbb cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.504358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.651 [2024-04-27 06:50:17.504415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d8d8d8d8 cdw11:d8d80ad8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.504428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.651 #58 NEW cov: 11716 ft: 14780 corp: 43/714b lim: 40 exec/s: 58 rss: 70Mb L: 36/37 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:47.651 [2024-04-27 06:50:17.543936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.651 [2024-04-27 06:50:17.543961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.910 #59 NEW cov: 11716 ft: 14838 corp: 44/725b lim: 40 exec/s: 59 rss: 70Mb L: 11/37 MS: 1 ChangeBinInt- 00:07:47.910 [2024-04-27 06:50:17.584431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d83dd8d8 cdw11:d8d80100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.584455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.584514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000d8bb cdw11:bbbbbbbb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.584528] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.584589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:bbbbbbbb cdw11:bbd8d8d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.584603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.584661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d80a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.584674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.910 #60 NEW cov: 11716 ft: 14843 corp: 45/762b lim: 40 exec/s: 60 rss: 70Mb L: 37/37 MS: 1 InsertByte- 00:07:47.910 [2024-04-27 06:50:17.624288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ad80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.624312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.624373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.624387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.910 #61 NEW cov: 11716 ft: 14874 corp: 46/782b lim: 40 exec/s: 61 rss: 70Mb L: 20/37 MS: 1 InsertByte- 00:07:47.910 [2024-04-27 06:50:17.664650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ad80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.664674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.664733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.664747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.664806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:001e270a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.664820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.910 [2024-04-27 06:50:17.664877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.910 [2024-04-27 06:50:17.664890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.910 #62 NEW cov: 11716 ft: 14882 corp: 47/816b lim: 40 exec/s: 31 rss: 70Mb L: 34/37 MS: 1 InsertRepeatedBytes- 00:07:47.910 #62 DONE cov: 11716 ft: 14882 corp: 47/816b lim: 40 exec/s: 31 rss: 70Mb 00:07:47.911 ###### Recommended 
dictionary. ###### 00:07:47.911 "\377\377\377\377\377\377\377\377" # Uses: 2 00:07:47.911 "\001\000\000\000" # Uses: 1 00:07:47.911 ###### End of recommended dictionary. ###### 00:07:47.911 Done 62 runs in 2 second(s) 00:07:47.911 06:50:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:47.911 06:50:17 -- ../common.sh@72 -- # (( i++ )) 00:07:47.911 06:50:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.169 06:50:17 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:48.169 06:50:17 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:48.169 06:50:17 -- nvmf/run.sh@24 -- # local timen=1 00:07:48.169 06:50:17 -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.169 06:50:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:48.169 06:50:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:48.169 06:50:17 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:48.169 06:50:17 -- nvmf/run.sh@29 -- # port=4411 00:07:48.169 06:50:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:48.169 06:50:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:48.169 06:50:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.169 06:50:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:48.169 [2024-04-27 06:50:17.846221] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:48.169 [2024-04-27 06:50:17.846288] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2623292 ] 00:07:48.169 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.169 [2024-04-27 06:50:18.019570] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.169 [2024-04-27 06:50:18.038883] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.169 [2024-04-27 06:50:18.039023] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.427 [2024-04-27 06:50:18.090631] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.427 [2024-04-27 06:50:18.106957] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:48.427 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.427 INFO: Seed: 3818643265 00:07:48.427 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:48.427 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:48.427 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:48.427 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.427 #2 INITED exec/s: 0 rss: 59Mb 00:07:48.428 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:07:48.428 This may also happen if the target rejected all inputs we tried so far 00:07:48.428 [2024-04-27 06:50:18.183990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.428 [2024-04-27 06:50:18.184027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.428 [2024-04-27 06:50:18.184160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.428 [2024-04-27 06:50:18.184177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.428 [2024-04-27 06:50:18.184306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.428 [2024-04-27 06:50:18.184326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.428 [2024-04-27 06:50:18.184477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.428 [2024-04-27 06:50:18.184498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.686 NEW_FUNC[1/664]: 0x4ac130 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:48.686 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.686 #6 NEW cov: 11500 ft: 11501 corp: 2/37b lim: 40 exec/s: 0 rss: 66Mb L: 36/36 MS: 4 ChangeBit-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:07:48.686 [2024-04-27 06:50:18.514534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.686 [2024-04-27 06:50:18.514589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.686 [2024-04-27 06:50:18.514749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.686 [2024-04-27 06:50:18.514778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.686 [2024-04-27 06:50:18.514936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff46 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.686 [2024-04-27 06:50:18.514963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.686 #7 NEW cov: 11614 ft: 12392 corp: 3/61b lim: 40 exec/s: 0 rss: 67Mb L: 24/36 MS: 1 EraseBytes- 00:07:48.686 [2024-04-27 06:50:18.564265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.686 [2024-04-27 
06:50:18.564293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.686 [2024-04-27 06:50:18.564419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.686 [2024-04-27 06:50:18.564435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.686 [2024-04-27 06:50:18.564560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff46 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.686 [2024-04-27 06:50:18.564578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.945 #8 NEW cov: 11620 ft: 12645 corp: 4/85b lim: 40 exec/s: 0 rss: 67Mb L: 24/36 MS: 1 CrossOver- 00:07:48.945 [2024-04-27 06:50:18.604838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.604864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.604994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.605012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.605141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.605157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.605287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.605304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.945 #9 NEW cov: 11705 ft: 12969 corp: 5/121b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 CrossOver- 00:07:48.945 [2024-04-27 06:50:18.644974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.645001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.645125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.645144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.645276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 
06:50:18.645292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.645414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffdffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.645432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.945 #10 NEW cov: 11705 ft: 13127 corp: 6/157b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 ChangeBit- 00:07:48.945 [2024-04-27 06:50:18.684420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.684458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.684592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.684612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.684747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:26ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.684765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.945 #11 NEW cov: 11705 ft: 13221 corp: 7/182b lim: 40 exec/s: 0 rss: 67Mb L: 25/36 MS: 1 InsertByte- 00:07:48.945 [2024-04-27 06:50:18.725023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.725049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.725183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.725201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.725333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.725350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.945 #12 NEW cov: 11705 ft: 13337 corp: 8/213b lim: 40 exec/s: 0 rss: 67Mb L: 31/36 MS: 1 EraseBytes- 00:07:48.945 [2024-04-27 06:50:18.764850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.764876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.765014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.765032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.765165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.765180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.945 #13 NEW cov: 11705 ft: 13386 corp: 9/239b lim: 40 exec/s: 0 rss: 67Mb L: 26/36 MS: 1 EraseBytes- 00:07:48.945 [2024-04-27 06:50:18.804907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.804934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.945 [2024-04-27 06:50:18.805066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.945 [2024-04-27 06:50:18.805089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.945 #14 NEW cov: 11705 ft: 13625 corp: 10/256b lim: 40 exec/s: 0 rss: 69Mb L: 17/36 MS: 1 EraseBytes- 00:07:49.204 [2024-04-27 06:50:18.845546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.845572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.845705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.845724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.845850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.845868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.845993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffdffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.846010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.204 #15 NEW cov: 11705 ft: 13654 corp: 11/292b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 ChangeByte- 00:07:49.204 [2024-04-27 06:50:18.885427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.885453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:49.204 [2024-04-27 06:50:18.885577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.885595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.885725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7fffffff cdw11:ffffff46 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.885742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.204 #16 NEW cov: 11705 ft: 13668 corp: 12/316b lim: 40 exec/s: 0 rss: 69Mb L: 24/36 MS: 1 ChangeBit- 00:07:49.204 [2024-04-27 06:50:18.925803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.925831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.925955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.925970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.926093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffb2b2b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.926111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.926243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.926264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.204 #17 NEW cov: 11705 ft: 13701 corp: 13/355b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:49.204 [2024-04-27 06:50:18.965929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.965956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.966078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.966095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.966220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.966238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:18.966366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:18.966382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.204 #18 NEW cov: 11705 ft: 13717 corp: 14/391b lim: 40 exec/s: 0 rss: 69Mb L: 36/39 MS: 1 ChangeByte- 00:07:49.204 [2024-04-27 06:50:19.006080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.006107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.006245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.006264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.006398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.006416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.006541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffff2b cdw11:fffdffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.006563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.204 #19 NEW cov: 11705 ft: 13801 corp: 15/427b lim: 40 exec/s: 0 rss: 69Mb L: 36/39 MS: 1 ChangeByte- 00:07:49.204 [2024-04-27 06:50:19.045869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.045896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.046033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.046049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.046186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:25ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.046209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.046342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:5dffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.046360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:49.204 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.204 #20 NEW cov: 11728 ft: 13840 corp: 16/464b lim: 40 exec/s: 0 rss: 69Mb L: 37/39 MS: 1 InsertByte- 00:07:49.204 [2024-04-27 06:50:19.095942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.204 [2024-04-27 06:50:19.095971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.204 [2024-04-27 06:50:19.096106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.205 [2024-04-27 06:50:19.096126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.205 [2024-04-27 06:50:19.096265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffb2b2b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.205 [2024-04-27 06:50:19.096283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.205 [2024-04-27 06:50:19.096413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.205 [2024-04-27 06:50:19.096429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.463 #21 NEW cov: 11728 ft: 13862 corp: 17/503b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:49.463 [2024-04-27 06:50:19.146421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.463 [2024-04-27 06:50:19.146447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.463 [2024-04-27 06:50:19.146584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.146602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.146741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.146758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.146888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.146904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.464 #22 NEW cov: 11728 ft: 13869 corp: 18/537b lim: 40 exec/s: 22 rss: 69Mb L: 34/39 MS: 1 EraseBytes- 00:07:49.464 [2024-04-27 06:50:19.186348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.186375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.186511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffffeff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.186534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.186674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.186691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.464 #23 NEW cov: 11728 ft: 13954 corp: 19/568b lim: 40 exec/s: 23 rss: 69Mb L: 31/39 MS: 1 ChangeBit- 00:07:49.464 [2024-04-27 06:50:19.236482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.236513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.236645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.236665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.236801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:26ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.236818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.464 #24 NEW cov: 11728 ft: 14024 corp: 20/593b lim: 40 exec/s: 24 rss: 69Mb L: 25/39 MS: 1 ChangeByte- 00:07:49.464 [2024-04-27 06:50:19.296985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.297014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.297146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.297163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.297258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffb2b2ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.297277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.297415] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.297433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.464 #25 NEW cov: 11728 ft: 14079 corp: 21/631b lim: 40 exec/s: 25 rss: 69Mb L: 38/39 MS: 1 EraseBytes- 00:07:49.464 [2024-04-27 06:50:19.346640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff5bffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.346668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.346763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.346778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.346899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.346918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.464 [2024-04-27 06:50:19.347046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffdff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.464 [2024-04-27 06:50:19.347064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.723 #26 NEW cov: 11728 ft: 14136 corp: 22/668b lim: 40 exec/s: 26 rss: 70Mb L: 37/39 MS: 1 InsertByte- 00:07:49.723 [2024-04-27 06:50:19.386777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.386806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.386928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.386947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.387072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.387088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.387213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff41 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.387231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.723 #27 NEW cov: 11728 ft: 14170 corp: 
23/705b lim: 40 exec/s: 27 rss: 70Mb L: 37/39 MS: 1 InsertByte- 00:07:49.723 [2024-04-27 06:50:19.427069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.427098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.427233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.427251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.427382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:26ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.427403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.723 #28 NEW cov: 11728 ft: 14210 corp: 24/730b lim: 40 exec/s: 28 rss: 70Mb L: 25/39 MS: 1 CopyPart- 00:07:49.723 [2024-04-27 06:50:19.477528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.477557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.477690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.477708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.477829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3bffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.477850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.477982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.478000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.723 #29 NEW cov: 11728 ft: 14278 corp: 25/768b lim: 40 exec/s: 29 rss: 70Mb L: 38/39 MS: 1 InsertByte- 00:07:49.723 [2024-04-27 06:50:19.527277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.527304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.527436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.527453] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.527578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.527596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.527720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.527738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.723 #30 NEW cov: 11728 ft: 14299 corp: 26/802b lim: 40 exec/s: 30 rss: 70Mb L: 34/39 MS: 1 CrossOver- 00:07:49.723 [2024-04-27 06:50:19.587835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.587865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.587995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.588013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.588145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffff0600 cdw11:004d4d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.588162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.723 [2024-04-27 06:50:19.588289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.723 [2024-04-27 06:50:19.588304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.723 #31 NEW cov: 11728 ft: 14389 corp: 27/840b lim: 40 exec/s: 31 rss: 70Mb L: 38/39 MS: 1 ChangeBinInt- 00:07:49.982 [2024-04-27 06:50:19.637301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.982 [2024-04-27 06:50:19.637329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.982 [2024-04-27 06:50:19.637463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.982 [2024-04-27 06:50:19.637485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.982 [2024-04-27 06:50:19.637626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7fffffff cdw11:ffff7f46 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.637643] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.983 #32 NEW cov: 11728 ft: 14394 corp: 28/864b lim: 40 exec/s: 32 rss: 70Mb L: 24/39 MS: 1 ChangeBit- 00:07:49.983 [2024-04-27 06:50:19.677392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.677427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.677549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fdffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.677567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.677696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.677714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.983 #33 NEW cov: 11728 ft: 14406 corp: 29/890b lim: 40 exec/s: 33 rss: 70Mb L: 26/39 MS: 1 ChangeBit- 00:07:49.983 [2024-04-27 06:50:19.727939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.727968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.728104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fbffffff cdw11:ffffff3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.728124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.728261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:26ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.728279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.983 #34 NEW cov: 11728 ft: 14411 corp: 30/915b lim: 40 exec/s: 34 rss: 70Mb L: 25/39 MS: 1 ChangeBit- 00:07:49.983 [2024-04-27 06:50:19.777992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff5bffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.778020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.778159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.778178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.778310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.778329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.778464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.778489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.983 #35 NEW cov: 11728 ft: 14413 corp: 31/949b lim: 40 exec/s: 35 rss: 70Mb L: 34/39 MS: 1 EraseBytes- 00:07:49.983 [2024-04-27 06:50:19.818185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffffffe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.818212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.818334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.818351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.818496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffdffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.818513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.983 #36 NEW cov: 11728 ft: 14423 corp: 32/977b lim: 40 exec/s: 36 rss: 70Mb L: 28/39 MS: 1 EraseBytes- 00:07:49.983 [2024-04-27 06:50:19.858027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.858055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.983 [2024-04-27 06:50:19.858202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.983 [2024-04-27 06:50:19.858220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.983 #37 NEW cov: 11728 ft: 14501 corp: 33/1000b lim: 40 exec/s: 37 rss: 70Mb L: 23/39 MS: 1 CrossOver- 00:07:50.242 [2024-04-27 06:50:19.908400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff2eff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.908427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:19.908549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.908564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.242 
[2024-04-27 06:50:19.908698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7fffffff cdw11:ffff7f46 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.908715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.242 #38 NEW cov: 11728 ft: 14536 corp: 34/1024b lim: 40 exec/s: 38 rss: 70Mb L: 24/39 MS: 1 ChangeByte- 00:07:50.242 [2024-04-27 06:50:19.947852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.947879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:19.948014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.948032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.242 #39 NEW cov: 11728 ft: 14545 corp: 35/1044b lim: 40 exec/s: 39 rss: 70Mb L: 20/39 MS: 1 CrossOver- 00:07:50.242 [2024-04-27 06:50:19.988809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.988837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:19.988963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.988981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:19.989103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.989118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:19.989245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff2bff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:19.989263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.242 #40 NEW cov: 11728 ft: 14553 corp: 36/1083b lim: 40 exec/s: 40 rss: 70Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:50.242 [2024-04-27 06:50:20.038518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:20.038546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:20.038671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:20.038689] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:20.038815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7fffffff cdw11:ffff8046 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:20.038834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.242 #41 NEW cov: 11728 ft: 14554 corp: 37/1107b lim: 40 exec/s: 41 rss: 70Mb L: 24/39 MS: 1 ChangeBinInt- 00:07:50.242 [2024-04-27 06:50:20.089262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:20.089292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:20.089442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:20.089461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:20.089583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.242 [2024-04-27 06:50:20.089600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.242 [2024-04-27 06:50:20.089742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fdffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.243 [2024-04-27 06:50:20.089760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.243 #42 NEW cov: 11728 ft: 14583 corp: 38/1143b lim: 40 exec/s: 42 rss: 70Mb L: 36/39 MS: 1 ChangeBit- 00:07:50.502 [2024-04-27 06:50:20.139250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.502 [2024-04-27 06:50:20.139281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.502 [2024-04-27 06:50:20.139429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fdffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.502 [2024-04-27 06:50:20.139446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.502 [2024-04-27 06:50:20.139578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff1affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.502 [2024-04-27 06:50:20.139596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.502 #43 NEW cov: 11728 ft: 14613 corp: 39/1169b lim: 40 exec/s: 21 rss: 70Mb L: 26/39 MS: 1 ChangeBinInt- 00:07:50.502 #43 DONE cov: 11728 ft: 14613 corp: 39/1169b lim: 40 exec/s: 21 rss: 70Mb 00:07:50.502 Done 43 runs in 2 
second(s) 00:07:50.502 06:50:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:50.502 06:50:20 -- ../common.sh@72 -- # (( i++ )) 00:07:50.502 06:50:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.502 06:50:20 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:50.502 06:50:20 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:50.502 06:50:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:50.502 06:50:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.502 06:50:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:50.502 06:50:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:50.502 06:50:20 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:50.502 06:50:20 -- nvmf/run.sh@29 -- # port=4412 00:07:50.502 06:50:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:50.502 06:50:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:50.502 06:50:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.502 06:50:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:50.502 [2024-04-27 06:50:20.312091] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:50.502 [2024-04-27 06:50:20.312145] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2623806 ] 00:07:50.502 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.761 [2024-04-27 06:50:20.487966] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.761 [2024-04-27 06:50:20.507877] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.761 [2024-04-27 06:50:20.508017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.761 [2024-04-27 06:50:20.559595] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.761 [2024-04-27 06:50:20.575893] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:50.761 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.761 INFO: Seed: 1991669265 00:07:50.761 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:50.761 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:50.761 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:50.761 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.761 #2 INITED exec/s: 0 rss: 60Mb 00:07:50.761 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
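The run.sh trace above shows how each fuzzer instance is isolated: the instance index (12 here) is zero-padded and appended to "44" to form a dedicated NVMe/TCP listener port, a per-instance corpus directory and JSON config are prepared, and the harness is pointed at that listener with -F and -c. A minimal sketch of that derivation, assuming the 44<NN> port convention and /tmp/fuzz_json_<N>.conf path visible in the trace (the redirect into the per-instance config is inferred, since run.sh@33 only traces the sed command itself):

    i=12
    port="44$(printf '%02d' "$i")"    # run.sh@29: printf %02d 12 -> port=4412
    conf="/tmp/fuzz_json_${i}.conf"   # later handed to the harness via -c
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
    # rewrite the template's default listener port (template path shortened here)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" fuzz_json.conf > "$conf"
    # llvm_nvme_fuzz is then started with -F "$trid" -c "$conf" -Z "$i" -r /var/tmp/spdk${i}.sock

One port, config, corpus directory, and RPC socket per index is what lets consecutive fuzzers (11, 12, ...) run back-to-back against freshly started targets without colliding.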
00:07:50.761 This may also happen if the target rejected all inputs we tried so far 00:07:50.761 [2024-04-27 06:50:20.625137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.761 [2024-04-27 06:50:20.625165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.761 [2024-04-27 06:50:20.625224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.761 [2024-04-27 06:50:20.625238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.761 [2024-04-27 06:50:20.625296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.761 [2024-04-27 06:50:20.625309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.020 NEW_FUNC[1/663]: 0x4adea0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:51.020 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.020 #13 NEW cov: 11479 ft: 11492 corp: 2/25b lim: 40 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:07:51.280 [2024-04-27 06:50:20.935992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:20.936025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:20.936086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:20.936101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:20.936160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:20.936174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.280 NEW_FUNC[1/1]: 0x1977630 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:07:51.280 #14 NEW cov: 11612 ft: 12082 corp: 3/49b lim: 40 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ChangeBit- 00:07:51.280 [2024-04-27 06:50:20.986043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:20.986069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:20.986142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:20.986157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:20.986214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:20.986228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.280 #15 NEW cov: 11618 ft: 12297 corp: 4/73b lim: 40 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ChangeBinInt- 00:07:51.280 [2024-04-27 06:50:21.026164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.026192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.026267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.026282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.026348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.026361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.280 #16 NEW cov: 11703 ft: 12582 corp: 5/98b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertByte- 00:07:51.280 [2024-04-27 06:50:21.066249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.066275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.066335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.066349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.066411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.066425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.280 #17 NEW cov: 11703 ft: 12645 corp: 6/122b lim: 40 exec/s: 0 rss: 68Mb L: 24/25 MS: 1 ChangeByte- 00:07:51.280 [2024-04-27 06:50:21.106385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.106416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.106492] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.106506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.106562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.106576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.280 #18 NEW cov: 11703 ft: 12744 corp: 7/147b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 ChangeByte- 00:07:51.280 [2024-04-27 06:50:21.146498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.146524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.146586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.146602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 [2024-04-27 06:50:21.146661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-27 06:50:21.146674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.280 #19 NEW cov: 11703 ft: 12871 corp: 8/171b lim: 40 exec/s: 0 rss: 69Mb L: 24/25 MS: 1 ShuffleBytes- 00:07:51.541 [2024-04-27 06:50:21.186624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.186650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.186713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.186727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.186786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.186800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.541 #20 NEW cov: 11703 ft: 12993 corp: 9/195b lim: 40 exec/s: 0 rss: 69Mb L: 24/25 MS: 1 ShuffleBytes- 00:07:51.541 [2024-04-27 06:50:21.226793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.226818] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.226878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.226892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.226950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0100000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.226964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.541 #21 NEW cov: 11703 ft: 13068 corp: 10/219b lim: 40 exec/s: 0 rss: 69Mb L: 24/25 MS: 1 ChangeBit- 00:07:51.541 [2024-04-27 06:50:21.266827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.266853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.266913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.266927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.266983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.266996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.541 #22 NEW cov: 11703 ft: 13163 corp: 11/244b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeByte- 00:07:51.541 [2024-04-27 06:50:21.306912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.306941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.307001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.307015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.307074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.307087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.541 #23 NEW cov: 11703 ft: 13180 corp: 12/268b lim: 40 exec/s: 0 rss: 69Mb L: 24/25 MS: 1 ShuffleBytes- 00:07:51.541 [2024-04-27 06:50:21.347073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 
nsid:0 cdw10:00000000 cdw11:00000018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.347099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.347156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.347170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.347226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.347239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.541 #24 NEW cov: 11703 ft: 13220 corp: 13/292b lim: 40 exec/s: 0 rss: 69Mb L: 24/25 MS: 1 CopyPart- 00:07:51.541 [2024-04-27 06:50:21.387198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.387222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.387282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.387295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.387353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.387366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.541 #25 NEW cov: 11703 ft: 13294 corp: 14/317b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeByte- 00:07:51.541 [2024-04-27 06:50:21.427302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.427327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.427386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.427404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.541 [2024-04-27 06:50:21.427461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:40000000 cdw11:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.541 [2024-04-27 06:50:21.427477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.801 #26 NEW cov: 11703 ft: 13313 corp: 15/342b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeBit- 
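Each "#N NEW ..." record in this stream is a standard libFuzzer status line: #N is the execution count, cov the number of coverage points (edges/blocks) observed, ft the finer-grained coverage features, corp the corpus size as inputs/bytes, lim the current input-length cap (40 bytes throughout these runs), L the new input's length over the largest in the corpus, and MS the mutation sequence that produced it. A minimal sketch for tracking coverage growth offline, assuming this console output has been saved to a file named build.log:

    grep -Eo '#[0-9]+ (NEW|REDUCE|DONE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' build.log |
      awk '{ sub(/^#/, "", $1); printf "iter=%-6s %-7s cov=%s ft=%s corp=%s\n", $1, $2, $4, $6, $8 }'

Watching whether cov and ft are still climbing at the end of a run is the quickest way to tell whether short runs like these are saturating the target or cutting exploration short.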
00:07:51.801 [2024-04-27 06:50:21.467417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002510 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.467441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.467499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.467512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.467583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0100000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.467597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.801 #27 NEW cov: 11703 ft: 13338 corp: 16/366b lim: 40 exec/s: 0 rss: 69Mb L: 24/25 MS: 1 ChangeBit- 00:07:51.801 [2024-04-27 06:50:21.507542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f0000ff cdw11:ffff0100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.507567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.507627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.507641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.507699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00003b00 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.507713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.801 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.801 #28 NEW cov: 11726 ft: 13371 corp: 17/395b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CMP- DE: "\377\377\377\001"- 00:07:51.801 [2024-04-27 06:50:21.547788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.547814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.547871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.547884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.547944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00777777 cdw11:77777777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
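The MS: 1 CMP- DE: "\377\377\377\001" tag on #28 above means the comparison-tracing mutator produced that input, and the four operand bytes ff ff ff 01 (printed in octal) were recorded as an auto-dictionary entry. Entries like this can also be pinned in a plain AFL/libFuzzer dictionary file; a minimal sketch, with the caveat that this log does not show whether the SPDK wrapper forwards a -dict= option to the harness:

    # pin the operand discovered above in an AFL/libFuzzer-style dictionary
    printf '%s\n' 'cmp_ffffff01="\xff\xff\xff\x01"' > nvme_admin.dict
    # on a stock libFuzzer target: ./fuzzer -dict=nvme_admin.dict corpus/

Seeding known-good operands this way spares the CMP pass from rediscovering them from scratch on every short run.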
00:07:51.801 [2024-04-27 06:50:21.547958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.548014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:77000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.548027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.801 #29 NEW cov: 11726 ft: 13686 corp: 18/427b lim: 40 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:51.801 [2024-04-27 06:50:21.587743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.587769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.587830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.587844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.587904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.587918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.801 #30 NEW cov: 11726 ft: 13721 corp: 19/451b lim: 40 exec/s: 30 rss: 70Mb L: 24/32 MS: 1 ChangeBinInt- 00:07:51.801 [2024-04-27 06:50:21.627882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.627908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.627969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.627984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.628041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.628054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.801 #31 NEW cov: 11726 ft: 13749 corp: 20/475b lim: 40 exec/s: 31 rss: 70Mb L: 24/32 MS: 1 ShuffleBytes- 00:07:51.801 [2024-04-27 06:50:21.667989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.668014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.668073] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.801 [2024-04-27 06:50:21.668087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.801 [2024-04-27 06:50:21.668146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00007777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.802 [2024-04-27 06:50:21.668160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.802 #32 NEW cov: 11726 ft: 13802 corp: 21/505b lim: 40 exec/s: 32 rss: 70Mb L: 30/32 MS: 1 CrossOver- 00:07:52.061 [2024-04-27 06:50:21.707778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.707804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.061 #39 NEW cov: 11726 ft: 14600 corp: 22/514b lim: 40 exec/s: 39 rss: 70Mb L: 9/32 MS: 2 ShuffleBytes-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:52.061 [2024-04-27 06:50:21.748236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000a00 cdw11:ffffff01 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.748260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.748318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.748332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.748410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000003b cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.748424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.061 #40 NEW cov: 11726 ft: 14609 corp: 23/544b lim: 40 exec/s: 40 rss: 70Mb L: 30/32 MS: 1 CrossOver- 00:07:52.061 [2024-04-27 06:50:21.788332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7ef7ffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.788357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.788424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.788438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.788494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.788508] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.061 #41 NEW cov: 11726 ft: 14641 corp: 24/569b lim: 40 exec/s: 41 rss: 70Mb L: 25/32 MS: 1 ChangeBinInt- 00:07:52.061 [2024-04-27 06:50:21.828449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.828474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.828534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.828547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.828603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.828617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.061 #42 NEW cov: 11726 ft: 14695 corp: 25/593b lim: 40 exec/s: 42 rss: 70Mb L: 24/32 MS: 1 ChangeBit- 00:07:52.061 [2024-04-27 06:50:21.868568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002510 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.868593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.868654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.868668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.061 [2024-04-27 06:50:21.868729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.868742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.061 #43 NEW cov: 11726 ft: 14711 corp: 26/617b lim: 40 exec/s: 43 rss: 70Mb L: 24/32 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:52.061 [2024-04-27 06:50:21.908691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000a00 cdw11:ffffff01 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.061 [2024-04-27 06:50:21.908715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.062 [2024-04-27 06:50:21.908777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.062 [2024-04-27 06:50:21.908790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.062 [2024-04-27 06:50:21.908850] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000003b cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.062 [2024-04-27 06:50:21.908863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.062 #44 NEW cov: 11726 ft: 14726 corp: 27/647b lim: 40 exec/s: 44 rss: 70Mb L: 30/32 MS: 1 ShuffleBytes- 00:07:52.062 [2024-04-27 06:50:21.948797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.062 [2024-04-27 06:50:21.948822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.062 [2024-04-27 06:50:21.948884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.062 [2024-04-27 06:50:21.948898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.062 [2024-04-27 06:50:21.948956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:56565600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.062 [2024-04-27 06:50:21.948970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.322 #45 NEW cov: 11726 ft: 14746 corp: 28/675b lim: 40 exec/s: 45 rss: 70Mb L: 28/32 MS: 1 InsertRepeatedBytes- 00:07:52.322 [2024-04-27 06:50:21.988916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:21.988941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:21.988999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:21.989013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:21.989073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0200000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:21.989086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.322 #46 NEW cov: 11726 ft: 14753 corp: 29/699b lim: 40 exec/s: 46 rss: 70Mb L: 24/32 MS: 1 ChangeBit- 00:07:52.322 [2024-04-27 06:50:22.018689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.018716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.322 #47 NEW cov: 11726 ft: 14847 corp: 30/712b lim: 40 exec/s: 47 rss: 71Mb L: 13/32 MS: 1 EraseBytes- 00:07:52.322 [2024-04-27 06:50:22.059094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:ffffff01 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.059118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:22.059175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:18000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.059188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:22.059259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.059273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.322 #48 NEW cov: 11726 ft: 14882 corp: 31/736b lim: 40 exec/s: 48 rss: 71Mb L: 24/32 MS: 1 PersAutoDict- DE: "\377\377\377\001"- 00:07:52.322 [2024-04-27 06:50:22.098943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.098968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.322 #49 NEW cov: 11726 ft: 14906 corp: 32/751b lim: 40 exec/s: 49 rss: 71Mb L: 15/32 MS: 1 EraseBytes- 00:07:52.322 [2024-04-27 06:50:22.139347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.139372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:22.139432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.139447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:22.139504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.139518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.322 #50 NEW cov: 11726 ft: 14915 corp: 33/776b lim: 40 exec/s: 50 rss: 71Mb L: 25/32 MS: 1 ShuffleBytes- 00:07:52.322 [2024-04-27 06:50:22.179416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7ef7ffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.179442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.322 [2024-04-27 06:50:22.179500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.179513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.322 
[2024-04-27 06:50:22.179568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.322 [2024-04-27 06:50:22.179581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.322 #51 NEW cov: 11726 ft: 14927 corp: 34/805b lim: 40 exec/s: 51 rss: 71Mb L: 29/32 MS: 1 InsertRepeatedBytes- 00:07:52.582 [2024-04-27 06:50:22.219743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.219768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.219844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.219858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.219917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.219930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.219987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00007777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.220001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.582 #52 NEW cov: 11726 ft: 14975 corp: 35/843b lim: 40 exec/s: 52 rss: 71Mb L: 38/38 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:52.582 [2024-04-27 06:50:22.259860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.259885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.259959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.259974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.260034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00777777 cdw11:77777777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.260048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.260106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:77d1d1d1 cdw11:d1d10000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.260119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.582 #53 NEW cov: 11726 ft: 14983 corp: 36/880b lim: 40 exec/s: 53 rss: 71Mb L: 37/38 MS: 1 InsertRepeatedBytes- 00:07:52.582 [2024-04-27 06:50:22.299819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.299844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.299905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.299918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.299993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.300007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.582 #54 NEW cov: 11726 ft: 14990 corp: 37/906b lim: 40 exec/s: 54 rss: 71Mb L: 26/38 MS: 1 CopyPart- 00:07:52.582 [2024-04-27 06:50:22.339983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.340008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.340066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.340080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.340136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000020 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.340150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.582 #55 NEW cov: 11726 ft: 14995 corp: 38/930b lim: 40 exec/s: 55 rss: 71Mb L: 24/38 MS: 1 ChangeBit- 00:07:52.582 [2024-04-27 06:50:22.380023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.380049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.380110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.380124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.380182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.380195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.582 #56 NEW cov: 11726 ft: 15004 corp: 39/961b lim: 40 exec/s: 56 rss: 71Mb L: 31/38 MS: 1 CrossOver- 00:07:52.582 [2024-04-27 06:50:22.419806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e0a00ff cdw11:ffff0100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.419832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 #59 NEW cov: 11726 ft: 15012 corp: 40/971b lim: 40 exec/s: 59 rss: 71Mb L: 10/38 MS: 3 CrossOver-CrossOver-PersAutoDict- DE: "\377\377\377\001"- 00:07:52.582 [2024-04-27 06:50:22.460391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.460421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.460497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.460511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.460570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.460584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.582 [2024-04-27 06:50:22.460640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00007777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.582 [2024-04-27 06:50:22.460660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.842 #60 NEW cov: 11726 ft: 15030 corp: 41/1009b lim: 40 exec/s: 60 rss: 72Mb L: 38/38 MS: 1 CrossOver- 00:07:52.842 [2024-04-27 06:50:22.500369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.500399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.500460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.500474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.500530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:40000027 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.500544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:52.843 #61 NEW cov: 11726 ft: 15038 corp: 42/1034b lim: 40 exec/s: 61 rss: 72Mb L: 25/38 MS: 1 ShuffleBytes- 00:07:52.843 [2024-04-27 06:50:22.540336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.540362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.540426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.540440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.843 #62 NEW cov: 11726 ft: 15229 corp: 43/1050b lim: 40 exec/s: 62 rss: 72Mb L: 16/38 MS: 1 EraseBytes- 00:07:52.843 [2024-04-27 06:50:22.580637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.580662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.580723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.580737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.580811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.580825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.843 #63 NEW cov: 11726 ft: 15233 corp: 44/1080b lim: 40 exec/s: 63 rss: 72Mb L: 30/38 MS: 1 CopyPart- 00:07:52.843 [2024-04-27 06:50:22.620709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000000 cdw11:ffffff01 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.620734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.620794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00003b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.620807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.843 [2024-04-27 06:50:22.620870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:40000027 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.843 [2024-04-27 06:50:22.620883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.843 #64 pulse cov: 11726 ft: 15238 corp: 44/1080b lim: 40 exec/s: 32 rss: 72Mb 00:07:52.843 #64 NEW cov: 11726 ft: 15238 corp: 45/1105b lim: 40 exec/s: 32 rss: 72Mb L: 25/38 MS: 1 PersAutoDict- DE: "\377\377\377\001"- 00:07:52.843 
#64 DONE cov: 11726 ft: 15238 corp: 45/1105b lim: 40 exec/s: 32 rss: 72Mb 00:07:52.843 ###### Recommended dictionary. ###### 00:07:52.843 "\377\377\377\001" # Uses: 3 00:07:52.843 "\000\000\000\000\000\000\000\000" # Uses: 2 00:07:52.843 ###### End of recommended dictionary. ###### 00:07:52.843 Done 64 runs in 2 second(s) 00:07:53.102 06:50:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:53.102 06:50:22 -- ../common.sh@72 -- # (( i++ )) 00:07:53.102 06:50:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.102 06:50:22 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:53.102 06:50:22 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:53.102 06:50:22 -- nvmf/run.sh@24 -- # local timen=1 00:07:53.102 06:50:22 -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.102 06:50:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:53.102 06:50:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:53.102 06:50:22 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:53.102 06:50:22 -- nvmf/run.sh@29 -- # port=4413 00:07:53.102 06:50:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:53.102 06:50:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:53.102 06:50:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.102 06:50:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:53.102 [2024-04-27 06:50:22.801700] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:53.102 [2024-04-27 06:50:22.801772] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624284 ] 00:07:53.102 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.102 [2024-04-27 06:50:22.982855] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.361 [2024-04-27 06:50:23.002248] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.361 [2024-04-27 06:50:23.002374] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.361 [2024-04-27 06:50:23.053744] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.361 [2024-04-27 06:50:23.070067] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:53.361 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:53.361 INFO: Seed: 191709472 00:07:53.361 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:53.361 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:53.361 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:53.361 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.361 #2 INITED exec/s: 0 rss: 59Mb 00:07:53.361 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.361 This may also happen if the target rejected all inputs we tried so far 00:07:53.361 [2024-04-27 06:50:23.115569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.361 [2024-04-27 06:50:23.115600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.361 [2024-04-27 06:50:23.115656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.361 [2024-04-27 06:50:23.115669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.361 [2024-04-27 06:50:23.115724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.361 [2024-04-27 06:50:23.115737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.361 [2024-04-27 06:50:23.115792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.361 [2024-04-27 06:50:23.115805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.620 NEW_FUNC[1/663]: 0x4afa60 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:53.620 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.620 #5 NEW cov: 11487 ft: 11488 corp: 2/39b lim: 40 exec/s: 0 rss: 67Mb L: 38/38 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:53.620 [2024-04-27 06:50:23.436349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.436382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.620 [2024-04-27 06:50:23.436441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.436457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.620 [2024-04-27 06:50:23.436514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.436527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.620 [2024-04-27 06:50:23.436581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.436596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.620 #11 NEW cov: 11600 ft: 11990 corp: 3/77b lim: 40 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 CrossOver- 00:07:53.620 [2024-04-27 06:50:23.476370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.476400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.620 [2024-04-27 06:50:23.476454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.476467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.620 [2024-04-27 06:50:23.476520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.476533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.620 [2024-04-27 06:50:23.476588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.620 [2024-04-27 06:50:23.476601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.620 #12 NEW cov: 11606 ft: 12277 corp: 4/115b lim: 40 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 ChangeBit- 00:07:53.879 [2024-04-27 06:50:23.516208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a414141 cdw11:41414141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.516235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.516307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41414141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.516321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 #15 NEW cov: 11691 ft: 13031 corp: 5/134b lim: 40 exec/s: 0 rss: 67Mb L: 19/38 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:53.880 [2024-04-27 06:50:23.556624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.556649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.556706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.556720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.556774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.556787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.556839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.556852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.880 #16 NEW cov: 11691 ft: 13147 corp: 6/172b lim: 40 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 ChangeBit- 00:07:53.880 [2024-04-27 06:50:23.596720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.596744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.596802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.596816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.596872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.596885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.596942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.596958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.880 #19 NEW cov: 11691 ft: 13199 corp: 7/209b lim: 40 exec/s: 0 rss: 67Mb L: 37/38 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:53.880 [2024-04-27 06:50:23.626795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.626819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.626875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.626889] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.626944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.626957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.627011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.627024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.880 #20 NEW cov: 11691 ft: 13235 corp: 8/247b lim: 40 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 CopyPart- 00:07:53.880 [2024-04-27 06:50:23.666869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.666893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.666935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.666948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.667005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.667018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.667074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.667087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.880 #21 NEW cov: 11691 ft: 13329 corp: 9/285b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeBit- 00:07:53.880 [2024-04-27 06:50:23.707051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.707076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.707130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.707143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.707214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:53.880 [2024-04-27 06:50:23.707231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.707284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.707297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.880 #22 NEW cov: 11691 ft: 13409 corp: 10/323b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeBinInt- 00:07:53.880 [2024-04-27 06:50:23.747099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7a02ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.747124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.747180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fffeffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.747193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.747249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.747262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.880 [2024-04-27 06:50:23.747317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.880 [2024-04-27 06:50:23.747330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.880 #23 NEW cov: 11691 ft: 13441 corp: 11/362b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertByte- 00:07:54.140 [2024-04-27 06:50:23.776868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8a00ffff cdw11:8a00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.776893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 #27 NEW cov: 11691 ft: 13813 corp: 12/375b lim: 40 exec/s: 0 rss: 68Mb L: 13/39 MS: 4 ChangeBit-CrossOver-ChangeBit-CopyPart- 00:07:54.140 [2024-04-27 06:50:23.817331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.817356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.817416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.817430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:54.140 [2024-04-27 06:50:23.817484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff26ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.817497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.817552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.817565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.140 #28 NEW cov: 11691 ft: 13860 corp: 13/413b lim: 40 exec/s: 0 rss: 68Mb L: 38/39 MS: 1 ChangeBinInt- 00:07:54.140 [2024-04-27 06:50:23.857429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.857454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.857510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.857523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.857581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.857594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.857653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff7eff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.857667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.140 #29 NEW cov: 11691 ft: 13874 corp: 14/451b lim: 40 exec/s: 0 rss: 68Mb L: 38/39 MS: 1 ChangeByte- 00:07:54.140 [2024-04-27 06:50:23.897457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.897483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.897539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.897552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.897611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.897624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.140 #30 NEW cov: 11691 ft: 14063 corp: 15/482b lim: 40 exec/s: 0 rss: 68Mb L: 31/39 MS: 1 EraseBytes- 00:07:54.140 [2024-04-27 06:50:23.937687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.937712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.937785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fefffffe cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.937799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.937855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff26ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.937869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.937924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.937938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.140 #31 NEW cov: 11691 ft: 14112 corp: 16/520b lim: 40 exec/s: 0 rss: 68Mb L: 38/39 MS: 1 ChangeBit- 00:07:54.140 [2024-04-27 06:50:23.977543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.977569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:23.977626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:23.977639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.140 #32 NEW cov: 11691 ft: 14162 corp: 17/541b lim: 40 exec/s: 0 rss: 68Mb L: 21/39 MS: 1 EraseBytes- 00:07:54.140 [2024-04-27 06:50:24.017887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:9602ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:24.017912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:24.017986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:24.018000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:24.018055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 
nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:24.018069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.140 [2024-04-27 06:50:24.018122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.140 [2024-04-27 06:50:24.018135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.400 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.400 #35 NEW cov: 11714 ft: 14200 corp: 18/580b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 3 ChangeByte-ChangeBinInt-CrossOver- 00:07:54.400 [2024-04-27 06:50:24.057784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8a00ffff cdw11:8a00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.057809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.057866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff8a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.057879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.400 #36 NEW cov: 11714 ft: 14219 corp: 19/597b lim: 40 exec/s: 0 rss: 69Mb L: 17/39 MS: 1 CopyPart- 00:07:54.400 [2024-04-27 06:50:24.097894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.097919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.097978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.097991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.400 #37 NEW cov: 11714 ft: 14233 corp: 20/619b lim: 40 exec/s: 37 rss: 69Mb L: 22/39 MS: 1 EraseBytes- 00:07:54.400 [2024-04-27 06:50:24.138256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.138281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.138353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.138366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.138427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.138440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.138495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.138508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.400 #38 NEW cov: 11714 ft: 14244 corp: 21/657b lim: 40 exec/s: 38 rss: 69Mb L: 38/39 MS: 1 ChangeBinInt- 00:07:54.400 [2024-04-27 06:50:24.178322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.178347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.178408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ff23ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.178421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.178477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.178490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.178545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.178558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.400 #39 NEW cov: 11714 ft: 14311 corp: 22/696b lim: 40 exec/s: 39 rss: 69Mb L: 39/39 MS: 1 InsertByte- 00:07:54.400 [2024-04-27 06:50:24.218471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdff26 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.218497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.218569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.218583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.218638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff26ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.218651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.218709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) 
qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.218723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.400 #40 NEW cov: 11714 ft: 14329 corp: 23/734b lim: 40 exec/s: 40 rss: 69Mb L: 38/39 MS: 1 ChangeBinInt- 00:07:54.400 [2024-04-27 06:50:24.258467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fffdff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.258492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.258549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.258562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.400 [2024-04-27 06:50:24.258635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.400 [2024-04-27 06:50:24.258649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.400 #41 NEW cov: 11714 ft: 14342 corp: 24/765b lim: 40 exec/s: 41 rss: 69Mb L: 31/39 MS: 1 ChangeBit- 00:07:54.659 [2024-04-27 06:50:24.298383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8a00ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.659 [2024-04-27 06:50:24.298413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 #42 NEW cov: 11714 ft: 14384 corp: 25/778b lim: 40 exec/s: 42 rss: 69Mb L: 13/39 MS: 1 CrossOver- 00:07:54.660 [2024-04-27 06:50:24.338851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdff26 cdw11:00000026 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.338876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.338931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.338945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.339002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff26ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.339016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.339070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.339083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.660 #43 NEW cov: 11714 ft: 14401 corp: 26/816b lim: 40 exec/s: 43 rss: 69Mb L: 38/39 MS: 1 ChangeByte- 00:07:54.660 [2024-04-27 06:50:24.378827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.378852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.378910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.378927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.378981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.378994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.660 #44 NEW cov: 11714 ft: 14414 corp: 27/846b lim: 40 exec/s: 44 rss: 69Mb L: 30/39 MS: 1 CopyPart- 00:07:54.660 [2024-04-27 06:50:24.419082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.419108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.419181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.419195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.419252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.419266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.419320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.419334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.660 #45 NEW cov: 11714 ft: 14432 corp: 28/884b lim: 40 exec/s: 45 rss: 69Mb L: 38/39 MS: 1 ShuffleBytes- 00:07:54.660 [2024-04-27 06:50:24.449117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.449142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.449217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 
cdw10:feffff02 cdw11:fdff2600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.449231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.449286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.449300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.449355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.449368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.660 #46 NEW cov: 11714 ft: 14475 corp: 29/922b lim: 40 exec/s: 46 rss: 69Mb L: 38/39 MS: 1 CrossOver- 00:07:54.660 [2024-04-27 06:50:24.489231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.489256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.489316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffff02 cdw11:fdff2600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.489330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.489386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.489403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.489474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff7fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.489487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.660 #47 NEW cov: 11714 ft: 14482 corp: 30/960b lim: 40 exec/s: 47 rss: 69Mb L: 38/39 MS: 1 ChangeBit- 00:07:54.660 [2024-04-27 06:50:24.529164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8a1600ff cdw11:ff8a00ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.529189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.660 [2024-04-27 06:50:24.529247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.660 [2024-04-27 06:50:24.529260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.919 #48 NEW cov: 11714 ft: 14492 corp: 31/978b lim: 40 exec/s: 48 rss: 
69Mb L: 18/39 MS: 1 InsertByte- 00:07:54.919 [2024-04-27 06:50:24.569145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7a0202ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.569170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.919 #51 NEW cov: 11714 ft: 14535 corp: 32/990b lim: 40 exec/s: 51 rss: 69Mb L: 12/39 MS: 3 CrossOver-InsertByte-CrossOver- 00:07:54.919 [2024-04-27 06:50:24.609636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.609661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.609715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.609729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.609781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.609794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.609849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff7eff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.609862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.919 #52 NEW cov: 11714 ft: 14544 corp: 33/1028b lim: 40 exec/s: 52 rss: 69Mb L: 38/39 MS: 1 CopyPart- 00:07:54.919 [2024-04-27 06:50:24.649715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7a02ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.649742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.649799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fffeffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.649812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.649867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.649880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.649934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff0000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.649947] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.919 #53 NEW cov: 11714 ft: 14588 corp: 34/1067b lim: 40 exec/s: 53 rss: 69Mb L: 39/39 MS: 1 CMP- DE: "\000\000"- 00:07:54.919 [2024-04-27 06:50:24.689848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.689873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.689929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.689942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.689999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.690012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.690068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.690081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.919 #54 NEW cov: 11714 ft: 14598 corp: 35/1103b lim: 40 exec/s: 54 rss: 69Mb L: 36/39 MS: 1 EraseBytes- 00:07:54.919 [2024-04-27 06:50:24.729955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.729980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.730034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.730047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.730103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff80df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.730116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.730170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.730186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.919 #55 NEW cov: 11714 ft: 14612 corp: 36/1142b lim: 40 exec/s: 55 rss: 69Mb L: 39/39 MS: 1 InsertByte- 00:07:54.919 [2024-04-27 06:50:24.760040] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7a02ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.760066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.760120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fffeffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.760133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.919 [2024-04-27 06:50:24.760186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffca SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.919 [2024-04-27 06:50:24.760198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.920 [2024-04-27 06:50:24.760247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff0000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.920 [2024-04-27 06:50:24.760260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.920 #56 NEW cov: 11714 ft: 14623 corp: 37/1181b lim: 40 exec/s: 56 rss: 70Mb L: 39/39 MS: 1 ChangeByte- 00:07:54.920 [2024-04-27 06:50:24.799900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:750a4141 cdw11:41414141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.920 [2024-04-27 06:50:24.799924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.920 [2024-04-27 06:50:24.799977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41414141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.920 [2024-04-27 06:50:24.799990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 #57 NEW cov: 11714 ft: 14625 corp: 38/1201b lim: 40 exec/s: 57 rss: 70Mb L: 20/39 MS: 1 InsertByte- 00:07:55.179 [2024-04-27 06:50:24.840270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffff00 cdw11:020000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.840294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.840349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.840363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.840418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.840431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.840483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.840496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.179 #58 NEW cov: 11714 ft: 14643 corp: 39/1239b lim: 40 exec/s: 58 rss: 70Mb L: 38/39 MS: 1 CMP- DE: "\000\002\000\000"- 00:07:55.179 [2024-04-27 06:50:24.880426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.880450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.880519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.880534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.880589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.880602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.880653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffefffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.880666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.179 #59 NEW cov: 11714 ft: 14653 corp: 40/1277b lim: 40 exec/s: 59 rss: 70Mb L: 38/39 MS: 1 ChangeBit- 00:07:55.179 [2024-04-27 06:50:24.920546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.920570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.920619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.920632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.920686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.920699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.920753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.920767] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.179 #60 NEW cov: 11714 ft: 14681 corp: 41/1316b lim: 40 exec/s: 60 rss: 70Mb L: 39/39 MS: 1 InsertByte- 00:07:55.179 [2024-04-27 06:50:24.960612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.960637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.960691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.960705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.960759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.960772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:24.960828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:24.960842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.179 #61 NEW cov: 11714 ft: 14688 corp: 42/1354b lim: 40 exec/s: 61 rss: 70Mb L: 38/39 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:55.179 [2024-04-27 06:50:25.000774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:9603ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.000800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:25.000871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.000885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:25.000918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.000931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:25.000988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.001001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.179 #62 NEW cov: 11714 ft: 14698 corp: 43/1393b lim: 40 exec/s: 62 rss: 70Mb L: 39/39 MS: 1 ChangeBit- 00:07:55.179 [2024-04-27 06:50:25.040896] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.040920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:25.040990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.041004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:25.041041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:feffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.041055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.179 [2024-04-27 06:50:25.041113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff26ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.179 [2024-04-27 06:50:25.041126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.179 #63 NEW cov: 11714 ft: 14711 corp: 44/1430b lim: 40 exec/s: 63 rss: 70Mb L: 37/39 MS: 1 CrossOver- 00:07:55.439 [2024-04-27 06:50:25.080971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02fdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.080997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.439 [2024-04-27 06:50:25.081053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffe0ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.081066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.439 [2024-04-27 06:50:25.081140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.081152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.439 [2024-04-27 06:50:25.081209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.081223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.439 #64 NEW cov: 11714 ft: 14715 corp: 45/1468b lim: 40 exec/s: 64 rss: 70Mb L: 38/39 MS: 1 ChangeByte- 00:07:55.439 [2024-04-27 06:50:25.121080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.121104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:55.439 [2024-04-27 06:50:25.121158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.121172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.439 [2024-04-27 06:50:25.121227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.121240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.439 [2024-04-27 06:50:25.121293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.439 [2024-04-27 06:50:25.121306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.439 #65 NEW cov: 11714 ft: 14781 corp: 46/1506b lim: 40 exec/s: 32 rss: 70Mb L: 38/39 MS: 1 CopyPart- 00:07:55.439 #65 DONE cov: 11714 ft: 14781 corp: 46/1506b lim: 40 exec/s: 32 rss: 70Mb 00:07:55.439 ###### Recommended dictionary. ###### 00:07:55.439 "\000\000" # Uses: 0 00:07:55.439 "\000\002\000\000" # Uses: 0 00:07:55.439 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:55.439 ###### End of recommended dictionary. ###### 00:07:55.439 Done 65 runs in 2 second(s) 00:07:55.439 06:50:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:55.439 06:50:25 -- ../common.sh@72 -- # (( i++ )) 00:07:55.439 06:50:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.439 06:50:25 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:55.439 06:50:25 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:55.439 06:50:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.439 06:50:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.439 06:50:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:55.439 06:50:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:55.439 06:50:25 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:55.439 06:50:25 -- nvmf/run.sh@29 -- # port=4414 00:07:55.439 06:50:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:55.439 06:50:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:55.440 06:50:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.440 06:50:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:55.440 [2024-04-27 06:50:25.296108] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:55.440 [2024-04-27 06:50:25.296203] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624642 ] 00:07:55.440 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.699 [2024-04-27 06:50:25.472292] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.699 [2024-04-27 06:50:25.491904] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.699 [2024-04-27 06:50:25.492047] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.699 [2024-04-27 06:50:25.543578] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.699 [2024-04-27 06:50:25.559893] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:55.699 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.699 INFO: Seed: 2681701895 00:07:55.958 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:55.958 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:55.958 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:55.958 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.958 #2 INITED exec/s: 0 rss: 59Mb 00:07:55.958 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.958 This may also happen if the target rejected all inputs we tried so far 00:07:56.216 NEW_FUNC[1/651]: 0x4b1620 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:56.216 NEW_FUNC[2/651]: 0x4cbe90 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:56.216 #7 NEW cov: 11403 ft: 11404 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 5 CrossOver-CopyPart-ChangeByte-ShuffleBytes-CMP- DE: "\001x\021\362\005\314>\212"- 00:07:56.216 [2024-04-27 06:50:25.967077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.216 [2024-04-27 06:50:25.967135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.216 NEW_FUNC[1/15]: 0x16ce0d0 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:07:56.216 NEW_FUNC[2/15]: 0x16ce310 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:07:56.216 #10 NEW cov: 11651 ft: 12135 corp: 3/23b lim: 35 exec/s: 0 rss: 67Mb L: 12/12 MS: 3 ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:07:56.216 #11 NEW cov: 11657 ft: 12283 corp: 4/33b lim: 35 exec/s: 0 rss: 67Mb L: 10/12 MS: 1 PersAutoDict- DE: "\001x\021\362\005\314>\212"- 00:07:56.216 #12 NEW cov: 11742 ft: 12619 corp: 5/43b lim: 35 exec/s: 0 rss: 67Mb L: 10/12 MS: 1 ShuffleBytes- 00:07:56.475 #18 NEW cov: 11742 ft: 12803 corp: 6/53b lim: 35 exec/s: 0 rss: 67Mb L: 10/12 MS: 1 ChangeBinInt- 00:07:56.475 [2024-04-27 06:50:26.157433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.475 [2024-04-27 06:50:26.157473] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.475 #19 NEW cov: 11749 ft: 12987 corp: 7/63b lim: 35 exec/s: 0 rss: 67Mb L: 10/12 MS: 1 ChangeBinInt- 00:07:56.475 [2024-04-27 06:50:26.207903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.475 [2024-04-27 06:50:26.207941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.475 [2024-04-27 06:50:26.208069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000075 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.475 [2024-04-27 06:50:26.208092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.475 #20 NEW cov: 11749 ft: 13718 corp: 8/81b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 CMP- DE: "\001\000\000\000\377\377\377\377"- 00:07:56.475 [2024-04-27 06:50:26.268129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.475 [2024-04-27 06:50:26.268163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.475 [2024-04-27 06:50:26.268298] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000075 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.475 [2024-04-27 06:50:26.268316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.475 #21 NEW cov: 11749 ft: 13833 corp: 9/99b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ChangeBit- 00:07:56.475 [2024-04-27 06:50:26.328028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.475 [2024-04-27 06:50:26.328055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.475 #22 NEW cov: 11749 ft: 13883 corp: 10/108b lim: 35 exec/s: 0 rss: 68Mb L: 9/18 MS: 1 EraseBytes- 00:07:56.734 [2024-04-27 06:50:26.378579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.734 [2024-04-27 06:50:26.378607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.734 #23 NEW cov: 11749 ft: 13925 corp: 11/126b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 CrossOver- 00:07:56.734 [2024-04-27 06:50:26.429230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.734 [2024-04-27 06:50:26.429256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.734 [2024-04-27 06:50:26.429389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.734 [2024-04-27 06:50:26.429410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.734 [2024-04-27 06:50:26.429555] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.734 [2024-04-27 06:50:26.429576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.734 #24 NEW cov: 11749 ft: 14341 corp: 12/154b lim: 35 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 CrossOver- 00:07:56.734 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.734 #25 NEW cov: 11772 ft: 14387 corp: 13/164b lim: 35 exec/s: 0 rss: 68Mb L: 10/28 MS: 1 CrossOver- 00:07:56.734 #26 NEW cov: 11772 ft: 14416 corp: 14/174b lim: 35 exec/s: 0 rss: 68Mb L: 10/28 MS: 1 ChangeBinInt- 00:07:56.734 [2024-04-27 06:50:26.589730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.734 [2024-04-27 06:50:26.589757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.734 [2024-04-27 06:50:26.589997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.734 [2024-04-27 06:50:26.590016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.734 #27 NEW cov: 11772 ft: 14558 corp: 15/207b lim: 35 exec/s: 27 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:57.002 #28 NEW cov: 11772 ft: 14592 corp: 16/218b lim: 35 exec/s: 28 rss: 68Mb L: 11/33 MS: 1 CrossOver- 00:07:57.002 [2024-04-27 06:50:26.679293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000085 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.679328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.002 [2024-04-27 06:50:26.679464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000085 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.679488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.002 [2024-04-27 06:50:26.679611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000085 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.679635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.002 #29 NEW cov: 11772 ft: 14672 corp: 17/245b lim: 35 exec/s: 29 rss: 68Mb L: 27/33 MS: 1 InsertRepeatedBytes- 00:07:57.002 #30 NEW cov: 11772 ft: 14719 corp: 18/255b lim: 35 exec/s: 30 rss: 68Mb L: 10/33 MS: 1 ShuffleBytes- 00:07:57.002 [2024-04-27 06:50:26.769985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.770013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.002 NEW_FUNC[1/2]: 0x4d29c0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:57.002 NEW_FUNC[2/2]: 0x116d810 in 
nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:07:57.002 #31 NEW cov: 11805 ft: 14858 corp: 19/277b lim: 35 exec/s: 31 rss: 68Mb L: 22/33 MS: 1 InsertRepeatedBytes- 00:07:57.002 [2024-04-27 06:50:26.830534] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000015 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.830563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.002 [2024-04-27 06:50:26.830704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.830722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.002 [2024-04-27 06:50:26.830855] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.830878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.002 #37 NEW cov: 11805 ft: 15018 corp: 20/305b lim: 35 exec/s: 37 rss: 68Mb L: 28/33 MS: 1 ShuffleBytes- 00:07:57.002 [2024-04-27 06:50:26.879627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.002 [2024-04-27 06:50:26.879659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.002 #38 NEW cov: 11805 ft: 15021 corp: 21/315b lim: 35 exec/s: 38 rss: 68Mb L: 10/33 MS: 1 ShuffleBytes- 00:07:57.261 [2024-04-27 06:50:26.930198] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.261 [2024-04-27 06:50:26.930224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 #39 NEW cov: 11805 ft: 15069 corp: 22/333b lim: 35 exec/s: 39 rss: 68Mb L: 18/33 MS: 1 PersAutoDict- DE: "\001\000\000\000\377\377\377\377"- 00:07:57.261 [2024-04-27 06:50:26.980693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000085 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.261 [2024-04-27 06:50:26.980724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 [2024-04-27 06:50:26.980865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000041 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.261 [2024-04-27 06:50:26.980888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.261 #40 NEW cov: 11805 ft: 15084 corp: 23/355b lim: 35 exec/s: 40 rss: 68Mb L: 22/33 MS: 1 CrossOver- 00:07:57.261 #41 NEW cov: 11805 ft: 15103 corp: 24/366b lim: 35 exec/s: 41 rss: 68Mb L: 11/33 MS: 1 InsertByte- 00:07:57.261 #42 NEW cov: 11805 ft: 15190 corp: 25/377b lim: 35 exec/s: 42 rss: 69Mb L: 11/33 MS: 1 ChangeBit- 00:07:57.261 [2024-04-27 06:50:27.131174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007c SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:57.261 [2024-04-27 06:50:27.131204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 [2024-04-27 06:50:27.131345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000041 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.261 [2024-04-27 06:50:27.131366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 #43 NEW cov: 11805 ft: 15217 corp: 26/399b lim: 35 exec/s: 43 rss: 69Mb L: 22/33 MS: 1 ShuffleBytes- 00:07:57.521 #49 NEW cov: 11805 ft: 15241 corp: 27/410b lim: 35 exec/s: 49 rss: 69Mb L: 11/33 MS: 1 InsertByte- 00:07:57.521 [2024-04-27 06:50:27.240893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.240923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 #50 NEW cov: 11805 ft: 15254 corp: 28/419b lim: 35 exec/s: 50 rss: 69Mb L: 9/33 MS: 1 ChangeBit- 00:07:57.521 [2024-04-27 06:50:27.291926] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.291956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 [2024-04-27 06:50:27.292044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.292071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.521 [2024-04-27 06:50:27.292207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.292233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 [2024-04-27 06:50:27.292369] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.292387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.521 #51 NEW cov: 11805 ft: 15411 corp: 29/447b lim: 35 exec/s: 51 rss: 69Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:07:57.521 [2024-04-27 06:50:27.331668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.331696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.521 [2024-04-27 06:50:27.331829] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.331848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 [2024-04-27 06:50:27.331985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:7 cdw10:8000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.332010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.521 #52 NEW cov: 11805 ft: 15425 corp: 30/475b lim: 35 exec/s: 52 rss: 69Mb L: 28/33 MS: 1 CMP- DE: "\000\000"- 00:07:57.521 [2024-04-27 06:50:27.381293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.521 [2024-04-27 06:50:27.381322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 #53 NEW cov: 11805 ft: 15428 corp: 31/482b lim: 35 exec/s: 53 rss: 69Mb L: 7/33 MS: 1 CrossOver- 00:07:57.781 [2024-04-27 06:50:27.431793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.431825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.781 [2024-04-27 06:50:27.431954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000cc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.431974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.781 #54 NEW cov: 11805 ft: 15448 corp: 32/499b lim: 35 exec/s: 54 rss: 69Mb L: 17/33 MS: 1 CopyPart- 00:07:57.781 [2024-04-27 06:50:27.492664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000015 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.492691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.781 [2024-04-27 06:50:27.492828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.492846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.781 [2024-04-27 06:50:27.492981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.493007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.781 #55 NEW cov: 11805 ft: 15470 corp: 33/527b lim: 35 exec/s: 55 rss: 69Mb L: 28/33 MS: 1 ChangeByte- 00:07:57.781 #56 NEW cov: 11805 ft: 15493 corp: 34/539b lim: 35 exec/s: 56 rss: 69Mb L: 12/33 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:57.781 [2024-04-27 06:50:27.602272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.602299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.781 [2024-04-27 06:50:27.602439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.781 [2024-04-27 06:50:27.602460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID 
NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.781 #57 NEW cov: 11805 ft: 15507 corp: 35/556b lim: 35 exec/s: 28 rss: 69Mb L: 17/33 MS: 1 PersAutoDict- DE: "\001\000\000\000\377\377\377\377"- 00:07:57.781 #57 DONE cov: 11805 ft: 15507 corp: 35/556b lim: 35 exec/s: 28 rss: 69Mb 00:07:57.781 ###### Recommended dictionary. ###### 00:07:57.781 "\001x\021\362\005\314>\212" # Uses: 1 00:07:57.781 "\001\000\000\000\377\377\377\377" # Uses: 2 00:07:57.781 "\000\000" # Uses: 1 00:07:57.781 ###### End of recommended dictionary. ###### 00:07:57.781 Done 57 runs in 2 second(s) 00:07:58.041 06:50:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:58.041 06:50:27 -- ../common.sh@72 -- # (( i++ )) 00:07:58.041 06:50:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.041 06:50:27 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:58.041 06:50:27 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:58.041 06:50:27 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.041 06:50:27 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.041 06:50:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:58.041 06:50:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:58.041 06:50:27 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:58.041 06:50:27 -- nvmf/run.sh@29 -- # port=4415 00:07:58.041 06:50:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:58.041 06:50:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:58.041 06:50:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.041 06:50:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:58.041 [2024-04-27 06:50:27.784770] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:58.041 [2024-04-27 06:50:27.784839] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2625179 ] 00:07:58.041 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.301 [2024-04-27 06:50:27.960001] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.301 [2024-04-27 06:50:27.979331] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.301 [2024-04-27 06:50:27.979480] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.301 [2024-04-27 06:50:28.030901] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.301 [2024-04-27 06:50:28.047184] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:58.301 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:58.301 INFO: Seed: 874746181 00:07:58.301 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:58.301 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:58.301 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:58.301 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.301 #2 INITED exec/s: 0 rss: 59Mb 00:07:58.301 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:58.301 This may also happen if the target rejected all inputs we tried so far 00:07:58.301 [2024-04-27 06:50:28.112379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.301 [2024-04-27 06:50:28.112412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.563 NEW_FUNC[1/660]: 0x4b2b60 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:58.563 NEW_FUNC[2/660]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.563 #14 NEW cov: 11455 ft: 11454 corp: 2/8b lim: 35 exec/s: 0 rss: 67Mb L: 7/7 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:58.879 NEW_FUNC[1/4]: 0x4d29c0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:58.880 NEW_FUNC[2/4]: 0xf7dd90 in posix_sock_recv /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1597 00:07:58.880 #16 NEW cov: 11596 ft: 12153 corp: 3/17b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 2 CopyPart-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:58.880 [2024-04-27 06:50:28.483556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.483592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.483649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.483663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 #18 NEW cov: 11602 ft: 12763 corp: 4/39b lim: 35 exec/s: 0 rss: 67Mb L: 22/22 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:58.880 [2024-04-27 06:50:28.523730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.523756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.523815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.523829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.523884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.523898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.523957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.523970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.880 #23 NEW cov: 11687 ft: 13514 corp: 5/69b lim: 35 exec/s: 0 rss: 67Mb L: 30/30 MS: 5 ChangeBit-InsertByte-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:58.880 [2024-04-27 06:50:28.563874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.563901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.563960] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.563974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.564026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.564040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.564095] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.564108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.880 #24 NEW cov: 11687 ft: 13570 corp: 6/100b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 InsertByte- 00:07:58.880 [2024-04-27 06:50:28.604060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.604087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.604150] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.604164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.604227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.604241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.880 #25 NEW cov: 11687 ft: 13815 corp: 7/132b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:58.880 [2024-04-27 06:50:28.644097] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:58.880 [2024-04-27 06:50:28.644123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.644200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.644214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.644272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.644287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.644325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.644338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.880 #26 NEW cov: 11687 ft: 13878 corp: 8/161b lim: 35 exec/s: 0 rss: 67Mb L: 29/32 MS: 1 EraseBytes- 00:07:58.880 [2024-04-27 06:50:28.684271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.684296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.684373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.684387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.684447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.684461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.880 #27 NEW cov: 11687 ft: 13906 corp: 9/189b lim: 35 exec/s: 0 rss: 67Mb L: 28/32 MS: 1 CrossOver- 00:07:58.880 [2024-04-27 06:50:28.724379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.724410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.724472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.724486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.724545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.724560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.880 [2024-04-27 06:50:28.724608] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.880 [2024-04-27 06:50:28.724621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.880 #28 NEW cov: 11687 ft: 13956 corp: 10/220b lim: 35 exec/s: 0 rss: 68Mb L: 31/32 MS: 1 CrossOver- 00:07:59.144 [2024-04-27 06:50:28.764420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.764447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.764508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.764522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 #29 NEW cov: 11687 ft: 13992 corp: 11/242b lim: 35 exec/s: 0 rss: 69Mb L: 22/32 MS: 1 ChangeByte- 00:07:59.144 [2024-04-27 06:50:28.804539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.804565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.804626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.804641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 #30 NEW cov: 11687 ft: 14016 corp: 12/267b lim: 35 exec/s: 0 rss: 69Mb L: 25/32 MS: 1 InsertRepeatedBytes- 00:07:59.144 [2024-04-27 06:50:28.844319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.844345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.144 #33 NEW cov: 11687 ft: 14095 corp: 13/278b lim: 35 exec/s: 0 rss: 69Mb L: 11/32 MS: 3 InsertByte-InsertByte-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:59.144 [2024-04-27 06:50:28.874821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.874846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.874906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.874920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.874980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.874993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.875051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.875064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.144 #34 NEW cov: 11687 ft: 14125 corp: 14/309b lim: 35 exec/s: 0 rss: 69Mb L: 31/32 MS: 1 ShuffleBytes- 00:07:59.144 [2024-04-27 06:50:28.914876] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.914901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.914961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.914978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.915034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.915048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.915103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.915117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.144 #35 NEW cov: 11687 ft: 14211 corp: 15/343b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:07:59.144 [2024-04-27 06:50:28.954885] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.954911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.954973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.954987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 #36 NEW cov: 11687 ft: 14225 corp: 16/366b lim: 35 exec/s: 0 rss: 69Mb L: 23/34 MS: 1 InsertRepeatedBytes- 00:07:59.144 [2024-04-27 06:50:28.995042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.995069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.995127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.995141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.995201] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.995214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:28.995270] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:28.995284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.144 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.144 #37 NEW cov: 11710 ft: 14283 corp: 17/397b lim: 35 exec/s: 0 rss: 69Mb L: 31/34 MS: 1 CrossOver- 00:07:59.144 [2024-04-27 06:50:29.035269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000037d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:29.035295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:29.035355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000017d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:29.035369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.144 [2024-04-27 06:50:29.035430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.144 [2024-04-27 06:50:29.035444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 #38 NEW cov: 11710 ft: 14305 corp: 18/431b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:59.405 [2024-04-27 06:50:29.075282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.075308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.075366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.075380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.075443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.075456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.075513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.075526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 #39 NEW cov: 11710 ft: 14320 corp: 19/462b lim: 35 exec/s: 39 rss: 69Mb L: 31/34 MS: 1 ChangeBit- 
00:07:59.405 [2024-04-27 06:50:29.115427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.115452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.115510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.115524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.115584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.115598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 #40 NEW cov: 11710 ft: 14357 corp: 20/496b lim: 35 exec/s: 40 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:59.405 [2024-04-27 06:50:29.155537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.155563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.155621] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.155634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.155694] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.155707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.155761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.155774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 #41 NEW cov: 11710 ft: 14399 corp: 21/527b lim: 35 exec/s: 41 rss: 69Mb L: 31/34 MS: 1 ChangeBit- 00:07:59.405 [2024-04-27 06:50:29.195763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.195792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.195854] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.195867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.195928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 
[2024-04-27 06:50:29.195941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 NEW_FUNC[1/1]: 0x4cbe90 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:59.405 #42 NEW cov: 11748 ft: 14453 corp: 22/557b lim: 35 exec/s: 42 rss: 69Mb L: 30/34 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:59.405 [2024-04-27 06:50:29.235710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.235736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.235794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.235808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.235868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.235881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 #43 NEW cov: 11748 ft: 14492 corp: 23/579b lim: 35 exec/s: 43 rss: 69Mb L: 22/34 MS: 1 CrossOver- 00:07:59.405 [2024-04-27 06:50:29.275817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.275842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-04-27 06:50:29.275903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-04-27 06:50:29.275917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.665 #44 NEW cov: 11748 ft: 14550 corp: 24/602b lim: 35 exec/s: 44 rss: 69Mb L: 23/34 MS: 1 ChangeByte- 00:07:59.665 [2024-04-27 06:50:29.316042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.316069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.316132] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.316145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.316219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.316233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.666 #45 NEW cov: 11748 ft: 14576 corp: 25/632b lim: 35 exec/s: 45 rss: 69Mb L: 
30/34 MS: 1 ChangeBit- 00:07:59.666 [2024-04-27 06:50:29.355733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.355761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.666 #46 NEW cov: 11748 ft: 14599 corp: 26/639b lim: 35 exec/s: 46 rss: 69Mb L: 7/34 MS: 1 ShuffleBytes- 00:07:59.666 [2024-04-27 06:50:29.396149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.396174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.396233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.396246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.666 #47 NEW cov: 11748 ft: 14619 corp: 27/662b lim: 35 exec/s: 47 rss: 69Mb L: 23/34 MS: 1 CopyPart- 00:07:59.666 [2024-04-27 06:50:29.436181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.436206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.436267] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.436281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.436340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.436353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.666 #48 NEW cov: 11748 ft: 14683 corp: 28/684b lim: 35 exec/s: 48 rss: 69Mb L: 22/34 MS: 1 ChangeBinInt- 00:07:59.666 [2024-04-27 06:50:29.466370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.466398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.466461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.466474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.466532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.466545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.466606] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.466619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.666 #49 NEW cov: 11748 ft: 14730 corp: 29/715b lim: 35 exec/s: 49 rss: 70Mb L: 31/34 MS: 1 ShuffleBytes- 00:07:59.666 [2024-04-27 06:50:29.506572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000037d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.506597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.506659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000017d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.506673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.506733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.506746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.666 #50 NEW cov: 11748 ft: 14742 corp: 30/749b lim: 35 exec/s: 50 rss: 70Mb L: 34/34 MS: 1 ChangeByte- 00:07:59.666 [2024-04-27 06:50:29.546592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.546618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.666 [2024-04-27 06:50:29.546678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.666 [2024-04-27 06:50:29.546692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.926 #56 NEW cov: 11748 ft: 14748 corp: 31/772b lim: 35 exec/s: 56 rss: 70Mb L: 23/34 MS: 1 EraseBytes- 00:07:59.926 [2024-04-27 06:50:29.586816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.586842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.586904] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.586918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.586995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.587009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.926 #57 NEW cov: 11748 ft: 14769 corp: 32/802b lim: 35 exec/s: 57 rss: 70Mb L: 30/34 MS: 1 ShuffleBytes- 00:07:59.926 [2024-04-27 
06:50:29.626963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.626988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.627049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.627063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.627121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.627135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.926 #58 NEW cov: 11748 ft: 14780 corp: 33/832b lim: 35 exec/s: 58 rss: 70Mb L: 30/34 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:59.926 [2024-04-27 06:50:29.667037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.667062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.667124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.667138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.667196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.667212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.926 #59 NEW cov: 11748 ft: 14788 corp: 34/862b lim: 35 exec/s: 59 rss: 70Mb L: 30/34 MS: 1 ChangeBinInt- 00:07:59.926 [2024-04-27 06:50:29.706844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.706869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.706927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.706941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 #60 NEW cov: 11748 ft: 14886 corp: 35/882b lim: 35 exec/s: 60 rss: 70Mb L: 20/34 MS: 1 EraseBytes- 00:07:59.926 [2024-04-27 06:50:29.747291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.747316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.747376] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.747390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.747454] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.747467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.926 #61 NEW cov: 11748 ft: 14890 corp: 36/915b lim: 35 exec/s: 61 rss: 70Mb L: 33/34 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:59.926 [2024-04-27 06:50:29.787342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.787367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.787425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.926 [2024-04-27 06:50:29.787439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 [2024-04-27 06:50:29.787497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.927 [2024-04-27 06:50:29.787510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.927 [2024-04-27 06:50:29.787567] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.927 [2024-04-27 06:50:29.787580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.927 #62 NEW cov: 11748 ft: 14900 corp: 37/944b lim: 35 exec/s: 62 rss: 70Mb L: 29/34 MS: 1 ChangeBinInt- 00:08:00.186 [2024-04-27 06:50:29.827514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.827540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.827600] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.827617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.827675] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.827687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.827743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 
06:50:29.827756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.186 #63 NEW cov: 11748 ft: 14907 corp: 38/973b lim: 35 exec/s: 63 rss: 70Mb L: 29/34 MS: 1 ChangeByte- 00:08:00.186 [2024-04-27 06:50:29.867202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.867227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 #64 NEW cov: 11748 ft: 14936 corp: 39/980b lim: 35 exec/s: 64 rss: 70Mb L: 7/34 MS: 1 EraseBytes- 00:08:00.186 [2024-04-27 06:50:29.907801] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.907826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.907886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.907899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.907960] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.907973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.186 #65 NEW cov: 11748 ft: 14947 corp: 40/1010b lim: 35 exec/s: 65 rss: 70Mb L: 30/34 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:00.186 [2024-04-27 06:50:29.947738] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.947763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.947826] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.947840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.947899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.947912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 #66 NEW cov: 11748 ft: 14971 corp: 41/1032b lim: 35 exec/s: 66 rss: 70Mb L: 22/34 MS: 1 CrossOver- 00:08:00.186 [2024-04-27 06:50:29.988020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.186 [2024-04-27 06:50:29.988044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-04-27 06:50:29.988106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.186 [2024-04-27 06:50:29.988119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.186 [2024-04-27 06:50:29.988180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.186 [2024-04-27 06:50:29.988194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.186 #67 NEW cov: 11748 ft: 14976 corp: 42/1066b lim: 35 exec/s: 67 rss: 70Mb L: 34/34 MS: 1 CrossOver-
00:08:00.186 [2024-04-27 06:50:30.028114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.186 [2024-04-27 06:50:30.028140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.186 [2024-04-27 06:50:30.028200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.187 [2024-04-27 06:50:30.028214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.187 [2024-04-27 06:50:30.028272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.187 [2024-04-27 06:50:30.028286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.187 [2024-04-27 06:50:30.028340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.187 [2024-04-27 06:50:30.028353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.187 #68 NEW cov: 11748 ft: 14986 corp: 43/1097b lim: 35 exec/s: 68 rss: 70Mb L: 31/34 MS: 1 CopyPart-
00:08:00.187 [2024-04-27 06:50:30.068068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.187 [2024-04-27 06:50:30.068093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.187 [2024-04-27 06:50:30.068156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.187 [2024-04-27 06:50:30.068170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.187 [2024-04-27 06:50:30.068229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.187 [2024-04-27 06:50:30.068243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.445 #69 NEW cov: 11748 ft: 14988 corp: 44/1119b lim: 35 exec/s: 69 rss: 70Mb L: 22/34 MS: 1 ShuffleBytes-
00:08:00.445 [2024-04-27 06:50:30.108413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.445 [2024-04-27 06:50:30.108439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.445 [2024-04-27 06:50:30.108501] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.445 [2024-04-27 06:50:30.108516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.445 [2024-04-27 06:50:30.108575] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.445 [2024-04-27 06:50:30.108590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.445 #70 NEW cov: 11748 ft: 14995 corp: 45/1149b lim: 35 exec/s: 35 rss: 70Mb L: 30/34 MS: 1 ChangeBinInt-
00:08:00.445 #70 DONE cov: 11748 ft: 14995 corp: 45/1149b lim: 35 exec/s: 35 rss: 70Mb
00:08:00.445 ###### Recommended dictionary. ######
00:08:00.445 "\001\000\000\000\000\000\000\000" # Uses: 5
00:08:00.445 ###### End of recommended dictionary. ######
00:08:00.445 Done 70 runs in 2 second(s)
00:08:00.445 06:50:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:08:00.445 06:50:30 -- ../common.sh@72 -- # (( i++ ))
00:08:00.445 06:50:30 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:00.445 06:50:30 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:08:00.445 06:50:30 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:08:00.445 06:50:30 -- nvmf/run.sh@24 -- # local timen=1
00:08:00.445 06:50:30 -- nvmf/run.sh@25 -- # local core=0x1
00:08:00.445 06:50:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:00.445 06:50:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:08:00.445 06:50:30 -- nvmf/run.sh@29 -- # printf %02d 16
00:08:00.445 06:50:30 -- nvmf/run.sh@29 -- # port=4416
00:08:00.445 06:50:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:00.445 06:50:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:08:00.445 06:50:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:00.445 06:50:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:08:00.445 [2024-04-27 06:50:30.288699] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:08:00.445 [2024-04-27 06:50:30.288780] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2625548 ]
00:08:00.445 EAL: No free 2048 kB hugepages reported on node 1
00:08:00.445 [2024-04-27 06:50:30.471296] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:00.445 [2024-04-27 06:50:30.491214] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:00.445 [2024-04-27 06:50:30.491342] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:00.445 [2024-04-27 06:50:30.543090] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:00.445 [2024-04-27 06:50:30.559449] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:08:00.445 INFO: Running with entropic power schedule (0xFF, 100).
00:08:00.445 INFO: Seed: 3385754690
00:08:00.445 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:08:00.445 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:08:00.445 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:00.445 INFO: A corpus is not provided, starting from an empty corpus
00:08:00.445 #2 INITED exec/s: 0 rss: 60Mb
00:08:00.445 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:00.445 This may also happen if the target rejected all inputs we tried so far
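The nvmf/run.sh trace above shows how the harness isolates each fuzzer instance: the instance number (16 here) is zero-padded with printf %02d and appended to 44 to form a dedicated TCP service id (4416), sed rewrites the stock fuzz_json.conf so the target listens on that trsvcid, and llvm_nvme_fuzz is then pointed at the matching transport id, its own corpus directory, and its own RPC socket (/var/tmp/spdk16.sock). A minimal stand-alone sketch of that per-instance setup, assuming the same template config the trace uses (derive_fuzz_port is an illustrative name, not a helper that exists in run.sh):

    #!/usr/bin/env bash
    # Sketch: reproduce the per-instance isolation visible in the trace above.
    derive_fuzz_port() {
        # Instance 16 -> 4416, instance 17 -> 4417, mirroring "printf %02d" + port=44NN.
        printf '44%02d' "$1"
    }

    fuzzer_type=16
    port=$(derive_fuzz_port "$fuzzer_type")
    nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"

    # Rewrite the template so the NVMe/TCP target listens on the derived port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"

    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

Because every instance gets a distinct port, config file, corpus directory, and -r RPC socket, the fuzzers can run back to back on the same node without colliding.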
00:08:00.963 [2024-04-27 06:50:30.604164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17433981650117718513 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.963 [2024-04-27 06:50:30.604201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:00.963 [2024-04-27 06:50:30.604237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.963 [2024-04-27 06:50:30.604256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:00.963 [2024-04-27 06:50:30.604287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.963 [2024-04-27 06:50:30.604304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:00.963 [2024-04-27 06:50:30.604337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.963 [2024-04-27 06:50:30.604354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:01.223 NEW_FUNC[1/664]: 0x4b4010 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519
00:08:01.223 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:01.223 #24 NEW cov: 11568 ft: 11563 corp: 2/86b lim: 105 exec/s: 0 rss: 67Mb L: 85/85 MS: 2 ChangeBit-InsertRepeatedBytes-
00:08:01.223 [2024-04-27 06:50:30.934969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17433981650117718513 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:30.935009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:30.935045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:30.935063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:30.935093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:30.935109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:30.935138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:30.935155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:01.223 #25 NEW cov: 11685 ft: 11916 corp: 3/184b lim: 105 exec/s: 0 rss: 67Mb L: 98/98 MS: 1 InsertRepeatedBytes-
00:08:01.223 [2024-04-27 06:50:31.004895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.004924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:31.004973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.004991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.223 #33 NEW cov: 11691 ft: 12762 corp: 4/226b lim: 105 exec/s: 0 rss: 67Mb L: 42/98 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes-
00:08:01.223 [2024-04-27 06:50:31.055036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.055065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:31.055099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.055116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.223 #34 NEW cov: 11776 ft: 13021 corp: 5/268b lim: 105 exec/s: 0 rss: 67Mb L: 42/98 MS: 1 ChangeByte-
00:08:01.223 [2024-04-27 06:50:31.115250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17433981650117718513 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.115284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:31.115334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.115352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:31.115382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.115406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.223 [2024-04-27 06:50:31.115435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.223 [2024-04-27 06:50:31.115452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:01.483 #40 NEW cov: 11776 ft: 13193 corp: 6/359b lim: 105 exec/s: 0 rss: 67Mb L: 91/98 MS: 1 CopyPart-
00:08:01.483 [2024-04-27 06:50:31.165373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.165409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.165457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.165474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.165504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.165520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.165548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.165564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:01.483 #44 NEW cov: 11776 ft: 13287 corp: 7/443b lim: 105 exec/s: 0 rss: 68Mb L: 84/98 MS: 4 CopyPart-EraseBytes-ChangeByte-InsertRepeatedBytes-
00:08:01.483 [2024-04-27 06:50:31.215457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16497232928240231652 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.215486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.215519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17433967304490742257 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.215536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.215566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.215583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.483 #45 NEW cov: 11776 ft: 13630 corp: 8/507b lim: 105 exec/s: 0 rss: 68Mb L: 64/98 MS: 1 CrossOver-
00:08:01.483 [2024-04-27 06:50:31.285653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829735429168882893 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.285686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.285733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14829735431805717965 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.285751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.285781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.285797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.483 #46 NEW cov: 11776 ft: 13727 corp: 9/580b lim: 105 exec/s: 0 rss: 68Mb L: 73/98 MS: 1 InsertRepeatedBytes-
00:08:01.483 [2024-04-27 06:50:31.335799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404061565668 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.335828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.483 [2024-04-27 06:50:31.335876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.483 [2024-04-27 06:50:31.335894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.742 #47 NEW cov: 11776 ft: 13787 corp: 10/622b lim: 105 exec/s: 0 rss: 68Mb L: 42/98 MS: 1 ShuffleBytes-
00:08:01.742 [2024-04-27 06:50:31.385904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.385935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.385982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.386000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.386030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.386045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.742 #48 NEW cov: 11776 ft: 13845 corp: 11/703b lim: 105 exec/s: 0 rss: 68Mb L: 81/98 MS: 1 CrossOver-
00:08:01.742 [2024-04-27 06:50:31.436030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.436060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.436110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.436130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.436160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.436177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.742 #49 NEW cov: 11776 ft: 13909 corp: 12/784b lim: 105 exec/s: 0 rss: 68Mb L: 81/98 MS: 1 ChangeByte-
00:08:01.742 [2024-04-27 06:50:31.496101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404061582564 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.496129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:01.742 #50 NEW cov: 11799 ft: 14454 corp: 13/806b lim: 105 exec/s: 0 rss: 68Mb L: 22/98 MS: 1 EraseBytes-
00:08:01.742 [2024-04-27 06:50:31.556353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.556382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.556435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.556453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.556483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.556499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:01.742 #51 NEW cov: 11799 ft: 14494 corp: 14/887b lim: 105 exec/s: 51 rss: 68Mb L: 81/98 MS: 1 ShuffleBytes-
00:08:01.742 [2024-04-27 06:50:31.626541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.626569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 [2024-04-27 06:50:31.626618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:01.742 [2024-04-27 06:50:31.626636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.001 #52 NEW cov: 11799 ft: 14542 corp: 15/929b lim: 105 exec/s: 52 rss: 68Mb L: 42/98 MS: 1 ShuffleBytes-
00:08:02.001 [2024-04-27 06:50:31.676705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829735429168882893 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.676735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.676782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14829735431805717965 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.676800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.676830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.676847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.001 #53 NEW cov: 11799 ft: 14611 corp: 16/1002b lim: 105 exec/s: 53 rss: 68Mb L: 73/98 MS: 1 ChangeBit-
00:08:02.001 [2024-04-27 06:50:31.746875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829735429168882893 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.746905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.746952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14829735431805717965 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.746973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.747004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.747020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.001 #54 NEW cov: 11799 ft: 14645 corp: 17/1075b lim: 105 exec/s: 54 rss: 68Mb L: 73/98 MS: 1 ShuffleBytes-
00:08:02.001 [2024-04-27 06:50:31.796978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.797009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.797041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.797058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.797088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:32768 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.797104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.001 #55 NEW cov: 11799 ft: 14650 corp: 18/1156b lim: 105 exec/s: 55 rss: 68Mb L: 81/98 MS: 1 ChangeBit-
00:08:02.001 [2024-04-27 06:50:31.847075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.847105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.847153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.847171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.001 [2024-04-27 06:50:31.847201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.001 [2024-04-27 06:50:31.847218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.261 #56 NEW cov: 11799 ft: 14656 corp: 19/1238b lim: 105 exec/s: 56 rss: 68Mb L: 82/98 MS: 1 InsertByte-
00:08:02.261 [2024-04-27 06:50:31.907222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829735429168882893 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:31.907253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:31.907301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559406694092260 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:31.907319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.261 #57 NEW cov: 11799 ft: 14693 corp: 20/1285b lim: 105 exec/s: 57 rss: 69Mb L: 47/98 MS: 1 EraseBytes-
00:08:02.261 [2024-04-27 06:50:31.967464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829735429168882893 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:31.967495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:31.967532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14829735431805717965 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:31.967550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:31.967580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:31.967596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.261 #58 NEW cov: 11799 ft: 14725 corp: 21/1358b lim: 105 exec/s: 58 rss: 69Mb L: 73/98 MS: 1 ShuffleBytes-
00:08:02.261 [2024-04-27 06:50:32.017586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17433981650117718513 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.017615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.017662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.017679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.017708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.017725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.017753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17433981653976478193 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.017769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:02.261 #59 NEW cov: 11799 ft: 14768 corp: 22/1456b lim: 105 exec/s: 59 rss: 69Mb L: 98/98 MS: 1 ShuffleBytes-
00:08:02.261 [2024-04-27 06:50:32.077748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.077777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.077825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.077843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.077872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:32768 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.077888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.077917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17182621696 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.077933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:02.261 #60 NEW cov: 11799 ft: 14847 corp: 23/1541b lim: 105 exec/s: 60 rss: 69Mb L: 85/98 MS: 1 CMP- DE: "\004\000\000\000"-
00:08:02.261 [2024-04-27 06:50:32.137915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.137945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.137982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.138000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.261 [2024-04-27 06:50:32.138030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:12289 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.261 [2024-04-27 06:50:32.138046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.521 #61 NEW cov: 11799 ft: 14867 corp: 24/1622b lim: 105 exec/s: 61 rss: 69Mb L: 81/98 MS: 1 ChangeByte-
00:08:02.521 [2024-04-27 06:50:32.187906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404061582564 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.187935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.521 #62 NEW cov: 11799 ft: 14871 corp: 25/1645b lim: 105 exec/s: 62 rss: 69Mb L: 23/98 MS: 1 InsertByte-
00:08:02.521 [2024-04-27 06:50:32.248174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829734329657255117 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.248202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.521 [2024-04-27 06:50:32.248249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14829735431805717965 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.248267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.521 [2024-04-27 06:50:32.248297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.248313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.521 #63 NEW cov: 11799 ft: 14876 corp: 26/1718b lim: 105 exec/s: 63 rss: 69Mb L: 73/98 MS: 1 ChangeBit-
00:08:02.521 [2024-04-27 06:50:32.308267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.308295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.521 [2024-04-27 06:50:32.308343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10752 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.308360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.521 #64 NEW cov: 11799 ft: 14891 corp: 27/1762b lim: 105 exec/s: 64 rss: 69Mb L: 44/98 MS: 1 EraseBytes-
00:08:02.521 [2024-04-27 06:50:32.368501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829735429168882893 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.368530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.521 [2024-04-27 06:50:32.368578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14829735431805717965 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.368595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.521 [2024-04-27 06:50:32.368625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.521 [2024-04-27 06:50:32.368645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.780 #65 NEW cov: 11799 ft: 14929 corp: 28/1835b lim: 105 exec/s: 65 rss: 69Mb L: 73/98 MS: 1 ShuffleBytes-
00:08:02.780 [2024-04-27 06:50:32.438688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16497232928240231652 len:61938 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.438716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.780 [2024-04-27 06:50:32.438764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17433967304490742193 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.438781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.780 [2024-04-27 06:50:32.438811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.438827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.780 #66 NEW cov: 11799 ft: 14958 corp: 29/1899b lim: 105 exec/s: 66 rss: 69Mb L: 64/98 MS: 1 ChangeBit-
00:08:02.780 [2024-04-27 06:50:32.498839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16493559404057257188 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.498868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.780 [2024-04-27 06:50:32.498899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.498915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.780 [2024-04-27 06:50:32.498944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:562949953421312 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.498959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.780 #67 NEW cov: 11799 ft: 14977 corp: 30/1980b lim: 105 exec/s: 67 rss: 69Mb L: 81/98 MS: 1 ChangeBit-
00:08:02.780 [2024-04-27 06:50:32.548955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14829734329657255117 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.548983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.780 [2024-04-27 06:50:32.549014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14774284861393718733 len:52686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.549031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.780 [2024-04-27 06:50:32.549059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16493559407081481444 len:58597 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:02.780 [2024-04-27 06:50:32.549074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:02.780 #68 NEW cov: 11799 ft: 15001 corp: 31/2054b lim: 105 exec/s: 34 rss: 69Mb L: 74/98 MS: 1 InsertByte-
00:08:02.780 #68 DONE cov: 11799 ft: 15001 corp: 31/2054b lim: 105 exec/s: 34 rss: 69Mb
00:08:02.780 ###### Recommended dictionary. ######
00:08:02.780 "\004\000\000\000" # Uses: 0
00:08:02.780 ###### End of recommended dictionary. ######
00:08:02.780 Done 68 runs in 2 second(s)
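The #N NEW and #N DONE lines throughout these runs are standard libFuzzer status output: cov counts covered coverage points, ft counts features, corp reports the number of corpus entries and their total size, lim is the current input-length cap, exec/s is throughput, and L and MS give the new input's length (versus the largest so far) and the mutation sequence that produced it. The summary above is self-consistent: run 16 executed 68 inputs at 34 exec/s over its 2 seconds. A rough way to pull these per-run summaries back out of a captured console log (console.log is a placeholder for wherever this output was saved):

    # Sketch: summarize each fuzzer run from the captured console output.
    grep -E 'Done [0-9]+ runs in' console.log
    # Final coverage and corpus size per run, taken from the DONE status lines:
    awk '/DONE cov:/ {print $2, "cov=" $5, "corp=" $9}' console.log
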
00:08:03.040 06:50:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
00:08:03.040 06:50:32 -- ../common.sh@72 -- # (( i++ ))
00:08:03.040 06:50:32 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:03.040 06:50:32 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:08:03.040 06:50:32 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:08:03.040 06:50:32 -- nvmf/run.sh@24 -- # local timen=1
00:08:03.040 06:50:32 -- nvmf/run.sh@25 -- # local core=0x1
00:08:03.040 06:50:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:03.040 06:50:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:08:03.040 06:50:32 -- nvmf/run.sh@29 -- # printf %02d 17
00:08:03.040 06:50:32 -- nvmf/run.sh@29 -- # port=4417
00:08:03.040 06:50:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:03.040 06:50:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:08:03.040 06:50:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:03.040 06:50:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
00:08:03.040 [2024-04-27 06:50:32.750308] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:08:03.040 [2024-04-27 06:50:32.750410] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2626011 ]
00:08:03.040 EAL: No free 2048 kB hugepages reported on node 1
00:08:03.298 [2024-04-27 06:50:32.929324] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:03.298 [2024-04-27 06:50:32.948681] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:03.298 [2024-04-27 06:50:32.948828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:03.298 [2024-04-27 06:50:33.000350] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:03.298 [2024-04-27 06:50:33.016683] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:08:03.298 INFO: Running with entropic power schedule (0xFF, 100).
00:08:03.298 INFO: Seed: 1548791945
00:08:03.298 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:08:03.298 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:08:03.298 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:03.298 INFO: A corpus is not provided, starting from an empty corpus
00:08:03.298 #2 INITED exec/s: 0 rss: 59Mb
00:08:03.298 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:03.298 This may also happen if the target rejected all inputs we tried so far
00:08:03.298 [2024-04-27 06:50:33.083072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.298 [2024-04-27 06:50:33.083108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.298 [2024-04-27 06:50:33.083233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.298 [2024-04-27 06:50:33.083252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:03.298 [2024-04-27 06:50:33.083371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.298 [2024-04-27 06:50:33.083399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:03.298 [2024-04-27 06:50:33.083516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.298 [2024-04-27 06:50:33.083543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:03.557 NEW_FUNC[1/663]: 0x4b7300 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540
00:08:03.557 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:03.557 #10 NEW cov: 11562 ft: 11594 corp: 2/116b lim: 120 exec/s: 0 rss: 67Mb L: 115/115 MS: 3 ChangeBinInt-ChangeByte-InsertRepeatedBytes-
00:08:03.816 [2024-04-27 06:50:33.424079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.424134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.424265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.424304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.424444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.424476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.424611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.424644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:03.816 NEW_FUNC[1/2]: 0x1c793e0 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1151
00:08:03.816 NEW_FUNC[2/2]: 0x1c79bc0 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1055
00:08:03.816 #21 NEW cov: 11706 ft: 12152 corp: 3/231b lim: 120 exec/s: 0 rss: 67Mb L: 115/115 MS: 1 CrossOver-
00:08:03.816 [2024-04-27 06:50:33.474068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.474102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.474210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.474235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.474357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.474380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.474504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.474526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:03.816 #22 NEW cov: 11712 ft: 12425 corp: 4/346b lim: 120 exec/s: 0 rss: 67Mb L: 115/115 MS: 1 CrossOver-
00:08:03.816 [2024-04-27 06:50:33.513877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.513912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.514007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.514029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:03.816 [2024-04-27 06:50:33.514147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.514170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:03.816 #23 NEW cov: 11712 ft: 12981 corp: 5/419b lim: 120 exec/s: 0 rss: 67Mb L: 73/115 MS: 1 EraseBytes-
00:08:03.816 [2024-04-27 06:50:33.553747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.816 [2024-04-27 06:50:33.553777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0
sqhd:0002 p:0 m:0 dnr:1 00:08:03.816 [2024-04-27 06:50:33.553882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.816 [2024-04-27 06:50:33.553908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.816 #29 NEW cov: 11797 ft: 13412 corp: 6/475b lim: 120 exec/s: 0 rss: 67Mb L: 56/115 MS: 1 EraseBytes- 00:08:03.816 [2024-04-27 06:50:33.593697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.816 [2024-04-27 06:50:33.593723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.816 #30 NEW cov: 11797 ft: 14375 corp: 7/511b lim: 120 exec/s: 0 rss: 68Mb L: 36/115 MS: 1 EraseBytes- 00:08:03.816 [2024-04-27 06:50:33.634594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.816 [2024-04-27 06:50:33.634624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.816 [2024-04-27 06:50:33.634688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.816 [2024-04-27 06:50:33.634711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.816 [2024-04-27 06:50:33.634826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.817 [2024-04-27 06:50:33.634865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.817 [2024-04-27 06:50:33.635002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.817 [2024-04-27 06:50:33.635026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.817 #31 NEW cov: 11797 ft: 14495 corp: 8/626b lim: 120 exec/s: 0 rss: 68Mb L: 115/115 MS: 1 ChangeBinInt- 00:08:03.817 [2024-04-27 06:50:33.683991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.817 [2024-04-27 06:50:33.684026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.076 #32 NEW cov: 11797 ft: 14538 corp: 9/650b lim: 120 exec/s: 0 rss: 68Mb L: 24/115 MS: 1 EraseBytes- 00:08:04.076 [2024-04-27 06:50:33.735144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.735173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.735251] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.735273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.735389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.735416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.735530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.735555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.076 #33 NEW cov: 11797 ft: 14641 corp: 10/766b lim: 120 exec/s: 0 rss: 68Mb L: 116/116 MS: 1 InsertByte- 00:08:04.076 [2024-04-27 06:50:33.775231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.775260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.775330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.775350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.775468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.775492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.775613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.775635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.775749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.775773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.076 #39 NEW cov: 11797 ft: 14689 corp: 11/886b lim: 120 exec/s: 0 rss: 68Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:08:04.076 [2024-04-27 06:50:33.815060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.815090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.815186] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.815207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.815323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.815348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.076 [2024-04-27 06:50:33.815471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.815494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.076 #40 NEW cov: 11797 ft: 14778 corp: 12/1002b lim: 120 exec/s: 0 rss: 68Mb L: 116/120 MS: 1 InsertByte- 00:08:04.076 [2024-04-27 06:50:33.855138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.076 [2024-04-27 06:50:33.855168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.077 [2024-04-27 06:50:33.855271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.077 [2024-04-27 06:50:33.855292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.077 [2024-04-27 06:50:33.855409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.077 [2024-04-27 06:50:33.855431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.077 [2024-04-27 06:50:33.855551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.077 [2024-04-27 06:50:33.855573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.077 #41 NEW cov: 11797 ft: 14813 corp: 13/1117b lim: 120 exec/s: 0 rss: 68Mb L: 115/120 MS: 1 ChangeBinInt- 00:08:04.077 [2024-04-27 06:50:33.894702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.077 [2024-04-27 06:50:33.894727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.077 #42 NEW cov: 11797 ft: 14832 corp: 14/1153b lim: 120 exec/s: 0 rss: 68Mb L: 36/120 MS: 1 CrossOver- 00:08:04.077 [2024-04-27 06:50:33.934666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.077 [2024-04-27 06:50:33.934693] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.077 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.077 #43 NEW cov: 11820 ft: 14873 corp: 15/1189b lim: 120 exec/s: 0 rss: 68Mb L: 36/120 MS: 1 CrossOver- 00:08:04.336 [2024-04-27 06:50:33.974792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.336 [2024-04-27 06:50:33.974826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.336 #44 NEW cov: 11820 ft: 14908 corp: 16/1213b lim: 120 exec/s: 0 rss: 68Mb L: 24/120 MS: 1 ChangeByte- 00:08:04.336 [2024-04-27 06:50:34.015638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.015667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.015782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.015805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.015916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.015937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.016048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216910452 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.016067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.337 #45 NEW cov: 11820 ft: 14957 corp: 17/1330b lim: 120 exec/s: 0 rss: 68Mb L: 117/120 MS: 1 InsertByte- 00:08:04.337 [2024-04-27 06:50:34.055367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.055393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.055511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8392022999170315380 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.055534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.337 #46 NEW cov: 11820 ft: 15021 corp: 18/1386b lim: 120 exec/s: 46 rss: 68Mb L: 56/120 MS: 1 ChangeBit- 00:08:04.337 [2024-04-27 06:50:34.095983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.096014] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.096112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.096134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.096254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.096277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.096387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.096415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.337 #47 NEW cov: 11820 ft: 15078 corp: 19/1503b lim: 120 exec/s: 47 rss: 68Mb L: 117/120 MS: 1 CopyPart- 00:08:04.337 [2024-04-27 06:50:34.136122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.136152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.136254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.136278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.136401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.136420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.136533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.136554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.337 #48 NEW cov: 11820 ft: 15107 corp: 20/1621b lim: 120 exec/s: 48 rss: 68Mb L: 118/120 MS: 1 CopyPart- 00:08:04.337 [2024-04-27 06:50:34.176183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.176215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.176303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:04.337 [2024-04-27 06:50:34.176325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.176437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.176460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.176570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.176591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.337 #49 NEW cov: 11820 ft: 15120 corp: 21/1739b lim: 120 exec/s: 49 rss: 68Mb L: 118/120 MS: 1 InsertRepeatedBytes- 00:08:04.337 [2024-04-27 06:50:34.215812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.215843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.337 [2024-04-27 06:50:34.215953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.337 [2024-04-27 06:50:34.215975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.597 #50 NEW cov: 11820 ft: 15145 corp: 22/1799b lim: 120 exec/s: 50 rss: 69Mb L: 60/120 MS: 1 CMP- DE: "\000\000\000\343"- 00:08:04.597 [2024-04-27 06:50:34.266646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.266676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.266780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15381046670902285781 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.266802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.266916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.266940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.267056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.267077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.267191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 
lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.267217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.597 #51 NEW cov: 11820 ft: 15210 corp: 23/1919b lim: 120 exec/s: 51 rss: 69Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:08:04.597 [2024-04-27 06:50:34.306649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.306679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.306775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894138 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.306797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.306910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.306931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.307044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.307067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.597 #52 NEW cov: 11820 ft: 15225 corp: 24/2034b lim: 120 exec/s: 52 rss: 69Mb L: 115/120 MS: 1 ChangeByte- 00:08:04.597 [2024-04-27 06:50:34.346749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.346778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.346855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.346877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.346993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.347015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.347129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.347151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.597 #53 NEW cov: 11820 ft: 15243 corp: 25/2153b lim: 120 exec/s: 53 rss: 
69Mb L: 119/120 MS: 1 InsertByte- 00:08:04.597 [2024-04-27 06:50:34.386107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.386136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.597 #54 NEW cov: 11820 ft: 15249 corp: 26/2200b lim: 120 exec/s: 54 rss: 69Mb L: 47/120 MS: 1 EraseBytes- 00:08:04.597 [2024-04-27 06:50:34.436446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9114861777597660798 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.436473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.436599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9114861777597660798 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.436625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.597 #55 NEW cov: 11820 ft: 15280 corp: 27/2249b lim: 120 exec/s: 55 rss: 69Mb L: 49/120 MS: 1 InsertRepeatedBytes- 00:08:04.597 [2024-04-27 06:50:34.477290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.477321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.477398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.477430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.477543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.477576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.477696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8409474447726376052 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.477712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.597 [2024-04-27 06:50:34.477829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.597 [2024-04-27 06:50:34.477851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.857 #56 NEW cov: 11820 ft: 15319 corp: 28/2369b lim: 120 exec/s: 56 rss: 69Mb L: 120/120 MS: 1 CrossOver- 00:08:04.857 [2024-04-27 06:50:34.516482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.516507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.857 #57 NEW cov: 11820 ft: 15321 corp: 29/2393b lim: 120 exec/s: 57 rss: 69Mb L: 24/120 MS: 1 ShuffleBytes- 00:08:04.857 [2024-04-27 06:50:34.557574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.557608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.557686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.557709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.557831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.557857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.557976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.557998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.558111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.558134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.857 #58 NEW cov: 11820 ft: 15328 corp: 30/2513b lim: 120 exec/s: 58 rss: 69Mb L: 120/120 MS: 1 ShuffleBytes- 00:08:04.857 [2024-04-27 06:50:34.596850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9114861777597660798 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.596880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.597002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9114861777597660798 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.597023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.857 #59 NEW cov: 11820 ft: 15338 corp: 31/2562b lim: 120 exec/s: 59 rss: 70Mb L: 49/120 MS: 1 ChangeBinInt- 00:08:04.857 [2024-04-27 06:50:34.637740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.637771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 
06:50:34.637859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.637882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.637993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:64022863232106496 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.638017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.638132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.638154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.638269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.638291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.857 #60 NEW cov: 11820 ft: 15367 corp: 32/2682b lim: 120 exec/s: 60 rss: 70Mb L: 120/120 MS: 1 PersAutoDict- DE: "\000\000\000\343"- 00:08:04.857 [2024-04-27 06:50:34.676954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.676980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.857 #61 NEW cov: 11820 ft: 15372 corp: 33/2718b lim: 120 exec/s: 61 rss: 70Mb L: 36/120 MS: 1 ChangeByte- 00:08:04.857 [2024-04-27 06:50:34.727417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9114861777597660798 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.727443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.857 [2024-04-27 06:50:34.727544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9114861777597660798 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.857 [2024-04-27 06:50:34.727568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.117 #62 NEW cov: 11820 ft: 15381 corp: 34/2767b lim: 120 exec/s: 62 rss: 70Mb L: 49/120 MS: 1 ChangeByte- 00:08:05.117 [2024-04-27 06:50:34.777992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.778023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.117 [2024-04-27 06:50:34.778098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 
06:50:34.778119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.117 [2024-04-27 06:50:34.778229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.778249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.117 [2024-04-27 06:50:34.778362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.778382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.117 #63 NEW cov: 11820 ft: 15409 corp: 35/2883b lim: 120 exec/s: 63 rss: 70Mb L: 116/120 MS: 1 ChangeBit- 00:08:05.117 [2024-04-27 06:50:34.817520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.817546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.117 #64 NEW cov: 11820 ft: 15431 corp: 36/2919b lim: 120 exec/s: 64 rss: 70Mb L: 36/120 MS: 1 CopyPart- 00:08:05.117 [2024-04-27 06:50:34.857536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.857561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.117 #65 NEW cov: 11820 ft: 15463 corp: 37/2966b lim: 120 exec/s: 65 rss: 70Mb L: 47/120 MS: 1 PersAutoDict- DE: "\000\000\000\343"- 00:08:05.117 [2024-04-27 06:50:34.897871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.897899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.117 [2024-04-27 06:50:34.898018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391462248235168884 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.898042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.117 #66 NEW cov: 11820 ft: 15467 corp: 38/3023b lim: 120 exec/s: 66 rss: 70Mb L: 57/120 MS: 1 InsertByte- 00:08:05.117 [2024-04-27 06:50:34.937690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.937718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.117 #67 NEW cov: 11820 ft: 15478 corp: 39/3059b lim: 120 exec/s: 67 rss: 70Mb L: 36/120 MS: 1 ChangeByte- 00:08:05.117 [2024-04-27 06:50:34.978334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:05.117 [2024-04-27 06:50:34.978367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.117 [2024-04-27 06:50:34.978476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.978500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.117 [2024-04-27 06:50:34.978615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.117 [2024-04-27 06:50:34.978638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.117 #68 NEW cov: 11820 ft: 15481 corp: 40/3131b lim: 120 exec/s: 68 rss: 70Mb L: 72/120 MS: 1 EraseBytes- 00:08:05.377 [2024-04-27 06:50:35.018269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.377 [2024-04-27 06:50:35.018302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.377 [2024-04-27 06:50:35.018417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.377 [2024-04-27 06:50:35.018441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.377 #69 NEW cov: 11820 ft: 15484 corp: 41/3195b lim: 120 exec/s: 69 rss: 70Mb L: 64/120 MS: 1 EraseBytes- 00:08:05.377 [2024-04-27 06:50:35.058642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.377 [2024-04-27 06:50:35.058676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.377 [2024-04-27 06:50:35.058765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.377 [2024-04-27 06:50:35.058788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.377 [2024-04-27 06:50:35.058909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.377 [2024-04-27 06:50:35.058932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.377 #70 NEW cov: 11820 ft: 15491 corp: 42/3268b lim: 120 exec/s: 35 rss: 70Mb L: 73/120 MS: 1 ShuffleBytes- 00:08:05.377 #70 DONE cov: 11820 ft: 15491 corp: 42/3268b lim: 120 exec/s: 35 rss: 70Mb 00:08:05.377 ###### Recommended dictionary. ###### 00:08:05.377 "\000\000\000\343" # Uses: 2 00:08:05.377 ###### End of recommended dictionary. 
###### 00:08:05.377 Done 70 runs in 2 second(s) 00:08:05.377 06:50:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:05.377 06:50:35 -- ../common.sh@72 -- # (( i++ )) 00:08:05.377 06:50:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.377 06:50:35 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:05.377 06:50:35 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:05.377 06:50:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.377 06:50:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.377 06:50:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:05.377 06:50:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:05.377 06:50:35 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:05.377 06:50:35 -- nvmf/run.sh@29 -- # port=4418 00:08:05.377 06:50:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:05.377 06:50:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:05.377 06:50:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.377 06:50:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:05.377 [2024-04-27 06:50:35.238488] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:05.377 [2024-04-27 06:50:35.238582] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2626548 ] 00:08:05.377 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.637 [2024-04-27 06:50:35.418192] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.637 [2024-04-27 06:50:35.437753] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.637 [2024-04-27 06:50:35.437877] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.637 [2024-04-27 06:50:35.489202] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.637 [2024-04-27 06:50:35.505496] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:05.637 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.637 INFO: Seed: 4037765489 00:08:05.897 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:05.897 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:05.897 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:05.897 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.897 #2 INITED exec/s: 0 rss: 59Mb 00:08:05.897 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.897 This may also happen if the target rejected all inputs we tried so far 00:08:05.897 [2024-04-27 06:50:35.550738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:05.897 [2024-04-27 06:50:35.550767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.897 [2024-04-27 06:50:35.550801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:05.897 [2024-04-27 06:50:35.550815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.897 [2024-04-27 06:50:35.550867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:05.897 [2024-04-27 06:50:35.550881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.157 NEW_FUNC[1/663]: 0x4bab60 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:06.157 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.157 #4 NEW cov: 11537 ft: 11538 corp: 2/66b lim: 100 exec/s: 0 rss: 67Mb L: 65/65 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:06.157 [2024-04-27 06:50:35.851244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.157 [2024-04-27 06:50:35.851282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.157 #12 NEW cov: 11650 ft: 12468 corp: 3/87b lim: 100 exec/s: 0 rss: 68Mb L: 21/65 MS: 3 InsertByte-InsertRepeatedBytes-CMP- DE: "\364\377\377\377"- 00:08:06.157 [2024-04-27 06:50:35.891398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.157 [2024-04-27 06:50:35.891424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.157 [2024-04-27 06:50:35.891457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.157 [2024-04-27 06:50:35.891471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.157 #33 NEW cov: 11656 ft: 12929 corp: 4/146b lim: 100 exec/s: 0 rss: 68Mb L: 59/65 MS: 1 EraseBytes- 00:08:06.157 [2024-04-27 06:50:35.931385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.157 [2024-04-27 06:50:35.931415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.157 #34 NEW cov: 11741 ft: 13209 corp: 5/167b lim: 100 exec/s: 0 rss: 68Mb L: 21/65 MS: 1 ChangeBit- 00:08:06.157 [2024-04-27 06:50:35.971771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.157 [2024-04-27 06:50:35.971795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.157 [2024-04-27 06:50:35.971843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:1 nsid:0 00:08:06.157 [2024-04-27 06:50:35.971857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.157 [2024-04-27 06:50:35.971909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.157 [2024-04-27 06:50:35.971924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.157 #36 NEW cov: 11741 ft: 13346 corp: 6/228b lim: 100 exec/s: 0 rss: 68Mb L: 61/65 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:06.157 [2024-04-27 06:50:36.011626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.157 [2024-04-27 06:50:36.011651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.157 #37 NEW cov: 11741 ft: 13511 corp: 7/249b lim: 100 exec/s: 0 rss: 68Mb L: 21/65 MS: 1 ShuffleBytes- 00:08:06.157 [2024-04-27 06:50:36.051704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.157 [2024-04-27 06:50:36.051730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 #38 NEW cov: 11741 ft: 13621 corp: 8/270b lim: 100 exec/s: 0 rss: 68Mb L: 21/65 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:06.417 [2024-04-27 06:50:36.092067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.092093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.092125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.417 [2024-04-27 06:50:36.092139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.092189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.417 [2024-04-27 06:50:36.092201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.417 #39 NEW cov: 11741 ft: 13679 corp: 9/331b lim: 100 exec/s: 0 rss: 68Mb L: 61/65 MS: 1 CrossOver- 00:08:06.417 [2024-04-27 06:50:36.132254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.132281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.132316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.417 [2024-04-27 06:50:36.132330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.132403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.417 [2024-04-27 06:50:36.132416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.417 #40 NEW cov: 11741 ft: 
13701 corp: 10/396b lim: 100 exec/s: 0 rss: 68Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:08:06.417 [2024-04-27 06:50:36.162306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.162332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.162370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.417 [2024-04-27 06:50:36.162385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.162442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.417 [2024-04-27 06:50:36.162456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.417 #41 NEW cov: 11741 ft: 13764 corp: 11/461b lim: 100 exec/s: 0 rss: 69Mb L: 65/65 MS: 1 ChangeByte- 00:08:06.417 [2024-04-27 06:50:36.202379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.202408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.202471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.417 [2024-04-27 06:50:36.202485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.202537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.417 [2024-04-27 06:50:36.202552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.417 #42 NEW cov: 11741 ft: 13785 corp: 12/522b lim: 100 exec/s: 0 rss: 69Mb L: 61/65 MS: 1 ChangeBit- 00:08:06.417 [2024-04-27 06:50:36.232302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.232329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 #43 NEW cov: 11741 ft: 13836 corp: 13/544b lim: 100 exec/s: 0 rss: 69Mb L: 22/65 MS: 1 InsertByte- 00:08:06.417 [2024-04-27 06:50:36.272636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.272663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.272700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.417 [2024-04-27 06:50:36.272714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.272764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.417 [2024-04-27 06:50:36.272780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:08:06.417 #44 NEW cov: 11741 ft: 13877 corp: 14/606b lim: 100 exec/s: 0 rss: 69Mb L: 62/65 MS: 1 EraseBytes- 00:08:06.417 [2024-04-27 06:50:36.312714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.417 [2024-04-27 06:50:36.312740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.312789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.417 [2024-04-27 06:50:36.312804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.417 [2024-04-27 06:50:36.312856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.677 [2024-04-27 06:50:36.312871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.678 #45 NEW cov: 11741 ft: 13930 corp: 15/668b lim: 100 exec/s: 0 rss: 69Mb L: 62/65 MS: 1 ShuffleBytes- 00:08:06.678 [2024-04-27 06:50:36.352813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.678 [2024-04-27 06:50:36.352839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.352874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.678 [2024-04-27 06:50:36.352889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.352943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.678 [2024-04-27 06:50:36.352957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.678 #46 NEW cov: 11741 ft: 13946 corp: 16/730b lim: 100 exec/s: 0 rss: 69Mb L: 62/65 MS: 1 ChangeBit- 00:08:06.678 [2024-04-27 06:50:36.392764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.678 [2024-04-27 06:50:36.392790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.678 #50 NEW cov: 11741 ft: 13994 corp: 17/765b lim: 100 exec/s: 0 rss: 69Mb L: 35/65 MS: 4 ChangeByte-InsertByte-InsertRepeatedBytes-CrossOver- 00:08:06.678 [2024-04-27 06:50:36.433153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.678 [2024-04-27 06:50:36.433180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.433225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.678 [2024-04-27 06:50:36.433238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.433290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.678 [2024-04-27 06:50:36.433304] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.433357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:06.678 [2024-04-27 06:50:36.433371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.678 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.678 #51 NEW cov: 11764 ft: 14282 corp: 18/847b lim: 100 exec/s: 0 rss: 69Mb L: 82/82 MS: 1 CopyPart- 00:08:06.678 [2024-04-27 06:50:36.473192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.678 [2024-04-27 06:50:36.473222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.473276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.678 [2024-04-27 06:50:36.473291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.473345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.678 [2024-04-27 06:50:36.473358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.678 #52 NEW cov: 11764 ft: 14286 corp: 19/908b lim: 100 exec/s: 0 rss: 69Mb L: 61/82 MS: 1 ChangeBit- 00:08:06.678 [2024-04-27 06:50:36.503400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.678 [2024-04-27 06:50:36.503427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.503465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.678 [2024-04-27 06:50:36.503480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.503531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.678 [2024-04-27 06:50:36.503546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.678 [2024-04-27 06:50:36.503597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:06.678 [2024-04-27 06:50:36.503611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.678 #53 NEW cov: 11764 ft: 14314 corp: 20/995b lim: 100 exec/s: 0 rss: 69Mb L: 87/87 MS: 1 InsertRepeatedBytes- 00:08:06.678 [2024-04-27 06:50:36.543171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.678 [2024-04-27 06:50:36.543198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.678 #54 NEW cov: 11764 ft: 14321 corp: 21/1016b lim: 100 exec/s: 54 rss: 69Mb L: 21/87 MS: 1 CrossOver- 00:08:06.939 [2024-04-27 06:50:36.583475] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.583502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.583535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.939 [2024-04-27 06:50:36.583550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.583602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.939 [2024-04-27 06:50:36.583616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.939 #57 NEW cov: 11764 ft: 14333 corp: 22/1094b lim: 100 exec/s: 57 rss: 69Mb L: 78/87 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:08:06.939 [2024-04-27 06:50:36.613480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.613506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.613539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.939 [2024-04-27 06:50:36.613554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.939 #58 NEW cov: 11764 ft: 14369 corp: 23/1153b lim: 100 exec/s: 58 rss: 69Mb L: 59/87 MS: 1 ChangeBit- 00:08:06.939 [2024-04-27 06:50:36.653706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.653732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.653767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.939 [2024-04-27 06:50:36.653781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.653846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.939 [2024-04-27 06:50:36.653860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.939 #59 NEW cov: 11764 ft: 14402 corp: 24/1220b lim: 100 exec/s: 59 rss: 69Mb L: 67/87 MS: 1 InsertRepeatedBytes- 00:08:06.939 [2024-04-27 06:50:36.693616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.693642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 #60 NEW cov: 11764 ft: 14416 corp: 25/1255b lim: 100 exec/s: 60 rss: 69Mb L: 35/87 MS: 1 EraseBytes- 00:08:06.939 [2024-04-27 06:50:36.733897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.733923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.733961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.939 [2024-04-27 06:50:36.733975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.734028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.939 [2024-04-27 06:50:36.734043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.939 #61 NEW cov: 11764 ft: 14428 corp: 26/1329b lim: 100 exec/s: 61 rss: 69Mb L: 74/87 MS: 1 CopyPart- 00:08:06.939 [2024-04-27 06:50:36.774014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.774041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.774074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.939 [2024-04-27 06:50:36.774088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.774140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:06.939 [2024-04-27 06:50:36.774153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.939 #62 NEW cov: 11764 ft: 14488 corp: 27/1389b lim: 100 exec/s: 62 rss: 70Mb L: 60/87 MS: 1 InsertByte- 00:08:06.939 [2024-04-27 06:50:36.814047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:06.939 [2024-04-27 06:50:36.814073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.939 [2024-04-27 06:50:36.814105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:06.939 [2024-04-27 06:50:36.814119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.939 #63 NEW cov: 11764 ft: 14510 corp: 28/1448b lim: 100 exec/s: 63 rss: 70Mb L: 59/87 MS: 1 InsertRepeatedBytes- 00:08:07.199 [2024-04-27 06:50:36.854069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.199 [2024-04-27 06:50:36.854094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.199 #64 NEW cov: 11764 ft: 14526 corp: 29/1469b lim: 100 exec/s: 64 rss: 70Mb L: 21/87 MS: 1 ChangeByte- 00:08:07.199 [2024-04-27 06:50:36.884359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.199 [2024-04-27 06:50:36.884384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.199 [2024-04-27 06:50:36.884436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.199 [2024-04-27 06:50:36.884450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.199 [2024-04-27 06:50:36.884500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.199 [2024-04-27 06:50:36.884514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.199 #65 NEW cov: 11764 ft: 14574 corp: 30/1530b lim: 100 exec/s: 65 rss: 70Mb L: 61/87 MS: 1 ChangeByte- 00:08:07.200 [2024-04-27 06:50:36.914291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.200 [2024-04-27 06:50:36.914317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.200 #66 NEW cov: 11764 ft: 14586 corp: 31/1566b lim: 100 exec/s: 66 rss: 70Mb L: 36/87 MS: 1 InsertByte- 00:08:07.200 [2024-04-27 06:50:36.954563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.200 [2024-04-27 06:50:36.954589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:36.954628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.200 [2024-04-27 06:50:36.954642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:36.954690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.200 [2024-04-27 06:50:36.954704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.200 #67 NEW cov: 11764 ft: 14606 corp: 32/1633b lim: 100 exec/s: 67 rss: 70Mb L: 67/87 MS: 1 ShuffleBytes- 00:08:07.200 [2024-04-27 06:50:36.994691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.200 [2024-04-27 06:50:36.994717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:36.994756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.200 [2024-04-27 06:50:36.994770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:36.994820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.200 [2024-04-27 06:50:36.994833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.200 #68 NEW cov: 11764 ft: 14618 corp: 33/1695b lim: 100 exec/s: 68 rss: 70Mb L: 62/87 MS: 1 ChangeBit- 00:08:07.200 [2024-04-27 06:50:37.024767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.200 [2024-04-27 06:50:37.024792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:37.024833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.200 [2024-04-27 06:50:37.024849] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:37.024900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.200 [2024-04-27 06:50:37.024913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.200 #69 NEW cov: 11764 ft: 14628 corp: 34/1756b lim: 100 exec/s: 69 rss: 70Mb L: 61/87 MS: 1 ChangeBinInt- 00:08:07.200 [2024-04-27 06:50:37.064893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.200 [2024-04-27 06:50:37.064918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:37.064953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.200 [2024-04-27 06:50:37.064967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.200 [2024-04-27 06:50:37.065017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.200 [2024-04-27 06:50:37.065030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.200 #70 NEW cov: 11764 ft: 14633 corp: 35/1822b lim: 100 exec/s: 70 rss: 70Mb L: 66/87 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:07.459 [2024-04-27 06:50:37.105014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.459 [2024-04-27 06:50:37.105041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.105079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.460 [2024-04-27 06:50:37.105093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.105145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.460 [2024-04-27 06:50:37.105160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.460 #71 NEW cov: 11764 ft: 14640 corp: 36/1887b lim: 100 exec/s: 71 rss: 70Mb L: 65/87 MS: 1 ChangeBinInt- 00:08:07.460 [2024-04-27 06:50:37.144915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.460 [2024-04-27 06:50:37.144939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.460 #72 NEW cov: 11764 ft: 14648 corp: 37/1922b lim: 100 exec/s: 72 rss: 70Mb L: 35/87 MS: 1 ChangeBit- 00:08:07.460 [2024-04-27 06:50:37.185402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.460 [2024-04-27 06:50:37.185427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.185472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:1 nsid:0 00:08:07.460 [2024-04-27 06:50:37.185486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.185533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.460 [2024-04-27 06:50:37.185547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.185597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:07.460 [2024-04-27 06:50:37.185611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.460 #73 NEW cov: 11764 ft: 14673 corp: 38/2017b lim: 100 exec/s: 73 rss: 70Mb L: 95/95 MS: 1 CopyPart- 00:08:07.460 [2024-04-27 06:50:37.225392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.460 [2024-04-27 06:50:37.225421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.225469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.460 [2024-04-27 06:50:37.225482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.225531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.460 [2024-04-27 06:50:37.225543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.460 #74 NEW cov: 11764 ft: 14691 corp: 39/2082b lim: 100 exec/s: 74 rss: 70Mb L: 65/95 MS: 1 CopyPart- 00:08:07.460 [2024-04-27 06:50:37.265405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.460 [2024-04-27 06:50:37.265431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.265471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.460 [2024-04-27 06:50:37.265485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.460 #76 NEW cov: 11764 ft: 14729 corp: 40/2138b lim: 100 exec/s: 76 rss: 70Mb L: 56/95 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:07.460 [2024-04-27 06:50:37.305603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.460 [2024-04-27 06:50:37.305628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.305665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.460 [2024-04-27 06:50:37.305679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.460 [2024-04-27 06:50:37.305728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.460 [2024-04-27 
06:50:37.305742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.460 #77 NEW cov: 11764 ft: 14736 corp: 41/2199b lim: 100 exec/s: 77 rss: 70Mb L: 61/95 MS: 1 ChangeBit- 00:08:07.460 [2024-04-27 06:50:37.345524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.460 [2024-04-27 06:50:37.345549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.719 #78 NEW cov: 11764 ft: 14791 corp: 42/2221b lim: 100 exec/s: 78 rss: 70Mb L: 22/95 MS: 1 CMP- DE: "I\251U\324\370\021x\000"- 00:08:07.719 [2024-04-27 06:50:37.385852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.719 [2024-04-27 06:50:37.385877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.719 [2024-04-27 06:50:37.385911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.719 [2024-04-27 06:50:37.385924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.719 [2024-04-27 06:50:37.385971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.719 [2024-04-27 06:50:37.385985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.719 #79 NEW cov: 11764 ft: 14798 corp: 43/2288b lim: 100 exec/s: 79 rss: 70Mb L: 67/95 MS: 1 PersAutoDict- DE: "I\251U\324\370\021x\000"- 00:08:07.719 [2024-04-27 06:50:37.426073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.719 [2024-04-27 06:50:37.426099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.719 [2024-04-27 06:50:37.426146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.719 [2024-04-27 06:50:37.426160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.719 [2024-04-27 06:50:37.426209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.719 [2024-04-27 06:50:37.426222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.719 [2024-04-27 06:50:37.426271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:07.719 [2024-04-27 06:50:37.426286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.719 #80 NEW cov: 11764 ft: 14828 corp: 44/2369b lim: 100 exec/s: 80 rss: 70Mb L: 81/95 MS: 1 InsertRepeatedBytes- 00:08:07.719 [2024-04-27 06:50:37.466115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.719 [2024-04-27 06:50:37.466140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.719 [2024-04-27 06:50:37.466200] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.720 [2024-04-27 06:50:37.466215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.720 [2024-04-27 06:50:37.466266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.720 [2024-04-27 06:50:37.466280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.720 #81 NEW cov: 11764 ft: 14834 corp: 45/2431b lim: 100 exec/s: 81 rss: 70Mb L: 62/95 MS: 1 InsertByte- 00:08:07.720 [2024-04-27 06:50:37.496347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.720 [2024-04-27 06:50:37.496375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.720 [2024-04-27 06:50:37.496432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:07.720 [2024-04-27 06:50:37.496448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.720 [2024-04-27 06:50:37.496499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:07.720 [2024-04-27 06:50:37.496514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.720 [2024-04-27 06:50:37.496565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:07.720 [2024-04-27 06:50:37.496577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.720 #82 NEW cov: 11764 ft: 14841 corp: 46/2517b lim: 100 exec/s: 82 rss: 70Mb L: 86/95 MS: 1 CopyPart- 00:08:07.720 [2024-04-27 06:50:37.536134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:07.720 [2024-04-27 06:50:37.536159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.720 #83 NEW cov: 11764 ft: 14844 corp: 47/2538b lim: 100 exec/s: 41 rss: 70Mb L: 21/95 MS: 1 ChangeBinInt- 00:08:07.720 #83 DONE cov: 11764 ft: 14844 corp: 47/2538b lim: 100 exec/s: 41 rss: 70Mb 00:08:07.720 ###### Recommended dictionary. ###### 00:08:07.720 "\364\377\377\377" # Uses: 2 00:08:07.720 "I\251U\324\370\021x\000" # Uses: 1 00:08:07.720 ###### End of recommended dictionary. 
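The "Recommended dictionary" block that closes the run above is standard libFuzzer output: byte sequences the fuzzer recovered from comparison instrumentation (the CMP and PersAutoDict mutation steps logged earlier) that went on to produce new coverage, each with a use count. The escapes are C-style octal; rewritten as hex they can be persisted in the AFL/libFuzzer dictionary format and fed back to a later run. A minimal sketch, assuming a hypothetical file name nvmf_18.dict; whether this SPDK wrapper forwards extra libFuzzer options such as -dict= through nvmf/run.sh to the llvm_nvme_fuzz binary is an assumption, so treat the flag as direct-invocation usage rather than part of the pipeline shown here:

  # nvmf_18.dict -- hypothetical dictionary file, one quoted entry per line.
  # "\364\377\377\377" (octal, 2 uses above) is F4 FF FF FF in hex:
  "\xF4\xFF\xFF\xFF"
  # "I\251U\324\370\021x\000" (octal, 1 use above) is 49 A9 55 D4 F8 11 78 00:
  "I\xA9U\xD4\xF8\x11x\x00"

With a libFuzzer target that accepts the option directly, the file would be passed as -dict=nvmf_18.dict so a later run seeds its mutations from these entries instead of rediscovering them.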
###### 00:08:07.720 Done 83 runs in 2 second(s) 00:08:07.979 06:50:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:07.979 06:50:37 -- ../common.sh@72 -- # (( i++ )) 00:08:07.979 06:50:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.979 06:50:37 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:07.979 06:50:37 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:07.979 06:50:37 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.979 06:50:37 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.979 06:50:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:07.979 06:50:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:07.979 06:50:37 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:07.979 06:50:37 -- nvmf/run.sh@29 -- # port=4419 00:08:07.979 06:50:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:07.979 06:50:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:07.979 06:50:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.979 06:50:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:07.979 [2024-04-27 06:50:37.716947] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:07.979 [2024-04-27 06:50:37.717029] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2626847 ] 00:08:07.979 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.238 [2024-04-27 06:50:37.891717] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.238 [2024-04-27 06:50:37.911006] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.238 [2024-04-27 06:50:37.911145] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.238 [2024-04-27 06:50:37.962572] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.238 [2024-04-27 06:50:37.978896] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:08.238 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.238 INFO: Seed: 2215813545 00:08:08.238 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:08.238 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:08.238 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:08.238 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.238 #2 INITED exec/s: 0 rss: 59Mb 00:08:08.238 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.238 This may also happen if the target rejected all inputs we tried so far 00:08:08.238 [2024-04-27 06:50:38.044778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3599 00:08:08.238 [2024-04-27 06:50:38.044822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.238 [2024-04-27 06:50:38.044952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.238 [2024-04-27 06:50:38.044973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.497 NEW_FUNC[1/663]: 0x4bdb20 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:08.497 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.497 #6 NEW cov: 11515 ft: 11514 corp: 2/25b lim: 50 exec/s: 0 rss: 67Mb L: 24/24 MS: 4 ChangeByte-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:08.497 [2024-04-27 06:50:38.385663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:08.497 [2024-04-27 06:50:38.385710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.497 [2024-04-27 06:50:38.385842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.497 [2024-04-27 06:50:38.385867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.756 #9 NEW cov: 11628 ft: 12186 corp: 3/50b lim: 50 exec/s: 0 rss: 67Mb L: 25/25 MS: 3 CrossOver-ChangeByte-CrossOver- 00:08:08.756 [2024-04-27 06:50:38.425642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:08.756 [2024-04-27 06:50:38.425673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.756 [2024-04-27 06:50:38.425792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.756 [2024-04-27 06:50:38.425815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.756 #10 NEW cov: 11634 ft: 12396 corp: 4/75b lim: 50 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CopyPart- 00:08:08.756 [2024-04-27 06:50:38.465890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1020080769185877518 len:3599 00:08:08.757 [2024-04-27 06:50:38.465917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.757 [2024-04-27 06:50:38.466041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.757 [2024-04-27 06:50:38.466066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.757 #11 NEW cov: 11719 ft: 12752 corp: 5/101b lim: 50 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 InsertByte- 00:08:08.757 [2024-04-27 06:50:38.505781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3599 00:08:08.757 [2024-04-27 06:50:38.505812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.757 [2024-04-27 06:50:38.505929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17506039205536337422 len:3599 00:08:08.757 [2024-04-27 06:50:38.505953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.757 #17 NEW cov: 11719 ft: 12865 corp: 6/125b lim: 50 exec/s: 0 rss: 67Mb L: 24/26 MS: 1 ChangeBinInt- 00:08:08.757 [2024-04-27 06:50:38.545788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:08.757 [2024-04-27 06:50:38.545820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.757 [2024-04-27 06:50:38.545942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.757 [2024-04-27 06:50:38.545966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.757 #18 NEW cov: 11719 ft: 12935 corp: 7/149b lim: 50 exec/s: 0 rss: 68Mb L: 24/26 MS: 1 EraseBytes- 00:08:08.757 [2024-04-27 06:50:38.586117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3599 00:08:08.757 [2024-04-27 06:50:38.586151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.757 [2024-04-27 06:50:38.586263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.757 [2024-04-27 06:50:38.586284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.757 #19 NEW cov: 11719 ft: 12978 corp: 8/173b lim: 50 exec/s: 0 rss: 68Mb L: 24/26 MS: 1 CrossOver- 00:08:08.757 [2024-04-27 06:50:38.626201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:08.757 [2024-04-27 06:50:38.626234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.757 [2024-04-27 06:50:38.626346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:08.757 [2024-04-27 06:50:38.626370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.757 #20 NEW cov: 11719 ft: 13001 corp: 9/198b lim: 50 exec/s: 0 rss: 68Mb L: 25/26 MS: 1 ChangeBinInt- 00:08:09.016 [2024-04-27 06:50:38.666369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366950100494 
len:3599 00:08:09.016 [2024-04-27 06:50:38.666400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.016 [2024-04-27 06:50:38.666523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.016 [2024-04-27 06:50:38.666542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.016 #21 NEW cov: 11719 ft: 13024 corp: 10/221b lim: 50 exec/s: 0 rss: 68Mb L: 23/26 MS: 1 CrossOver- 00:08:09.016 [2024-04-27 06:50:38.706075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366950100494 len:3599 00:08:09.016 [2024-04-27 06:50:38.706101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.016 [2024-04-27 06:50:38.706188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.016 [2024-04-27 06:50:38.706211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.016 #27 NEW cov: 11719 ft: 13085 corp: 11/244b lim: 50 exec/s: 0 rss: 68Mb L: 23/26 MS: 1 ShuffleBytes- 00:08:09.016 [2024-04-27 06:50:38.746623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.016 [2024-04-27 06:50:38.746664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.016 [2024-04-27 06:50:38.746780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.016 [2024-04-27 06:50:38.746803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.017 #28 NEW cov: 11719 ft: 13115 corp: 12/268b lim: 50 exec/s: 0 rss: 68Mb L: 24/26 MS: 1 CopyPart- 00:08:09.017 [2024-04-27 06:50:38.786661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.017 [2024-04-27 06:50:38.786692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.786779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.017 [2024-04-27 06:50:38.786804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.786873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.017 [2024-04-27 06:50:38.786899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.017 #29 NEW cov: 11719 ft: 13412 corp: 13/304b lim: 50 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 CrossOver- 00:08:09.017 [2024-04-27 06:50:38.837193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:1012762421779893774 len:3599 00:08:09.017 [2024-04-27 06:50:38.837226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.837307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012771250185833998 len:5655 00:08:09.017 [2024-04-27 06:50:38.837328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.837442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 00:08:09.017 [2024-04-27 06:50:38.837465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.837582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419867817486 len:3599 00:08:09.017 [2024-04-27 06:50:38.837606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.017 #30 NEW cov: 11719 ft: 13748 corp: 14/345b lim: 50 exec/s: 0 rss: 68Mb L: 41/41 MS: 1 InsertRepeatedBytes- 00:08:09.017 [2024-04-27 06:50:38.876947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419785436686 len:3599 00:08:09.017 [2024-04-27 06:50:38.876978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.877078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.017 [2024-04-27 06:50:38.877103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.017 #31 NEW cov: 11719 ft: 13832 corp: 15/370b lim: 50 exec/s: 0 rss: 68Mb L: 25/41 MS: 1 ChangeByte- 00:08:09.017 [2024-04-27 06:50:38.906630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762423792173312 len:3599 00:08:09.017 [2024-04-27 06:50:38.906661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.017 [2024-04-27 06:50:38.906754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.017 [2024-04-27 06:50:38.906773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.276 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.276 #32 NEW cov: 11742 ft: 13858 corp: 16/394b lim: 50 exec/s: 0 rss: 68Mb L: 24/41 MS: 1 CMP- DE: "\377\377\001\000"- 00:08:09.276 [2024-04-27 06:50:38.957531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.276 [2024-04-27 06:50:38.957565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.276 [2024-04-27 06:50:38.957653] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419741068814 len:3599 00:08:09.276 [2024-04-27 06:50:38.957675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.276 [2024-04-27 06:50:38.957807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.276 [2024-04-27 06:50:38.957831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.276 [2024-04-27 06:50:38.957951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:09.276 [2024-04-27 06:50:38.957976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.276 #33 NEW cov: 11742 ft: 13872 corp: 17/435b lim: 50 exec/s: 0 rss: 68Mb L: 41/41 MS: 1 CrossOver- 00:08:09.276 [2024-04-27 06:50:38.997078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762423792173312 len:3599 00:08:09.276 [2024-04-27 06:50:38.997108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:38.997202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.277 [2024-04-27 06:50:38.997223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.277 #34 NEW cov: 11742 ft: 13881 corp: 18/459b lim: 50 exec/s: 34 rss: 68Mb L: 24/41 MS: 1 ShuffleBytes- 00:08:09.277 [2024-04-27 06:50:39.037382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.277 [2024-04-27 06:50:39.037419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.037541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3600 00:08:09.277 [2024-04-27 06:50:39.037561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.277 #35 NEW cov: 11742 ft: 13913 corp: 19/484b lim: 50 exec/s: 35 rss: 68Mb L: 25/41 MS: 1 ChangeBinInt- 00:08:09.277 [2024-04-27 06:50:39.067547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.277 [2024-04-27 06:50:39.067576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.067662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.277 [2024-04-27 06:50:39.067685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.277 #36 NEW cov: 11742 ft: 13919 corp: 20/508b lim: 50 exec/s: 36 rss: 69Mb L: 24/41 MS: 1 ChangeByte- 00:08:09.277 
[2024-04-27 06:50:39.107952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762868456492558 len:3599 00:08:09.277 [2024-04-27 06:50:39.107985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.108101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762454092811790 len:5655 00:08:09.277 [2024-04-27 06:50:39.108126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.108247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 00:08:09.277 [2024-04-27 06:50:39.108272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.108393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419867817494 len:3599 00:08:09.277 [2024-04-27 06:50:39.108424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.277 #37 NEW cov: 11742 ft: 13975 corp: 21/550b lim: 50 exec/s: 37 rss: 69Mb L: 42/42 MS: 1 InsertByte- 00:08:09.277 [2024-04-27 06:50:39.147952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3599 00:08:09.277 [2024-04-27 06:50:39.147983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.148062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.277 [2024-04-27 06:50:39.148086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.277 [2024-04-27 06:50:39.148197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3827 00:08:09.277 [2024-04-27 06:50:39.148218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.537 #43 NEW cov: 11742 ft: 13984 corp: 22/589b lim: 50 exec/s: 43 rss: 69Mb L: 39/42 MS: 1 CrossOver- 00:08:09.537 [2024-04-27 06:50:39.187814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.537 [2024-04-27 06:50:39.187843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.187920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012763076871065102 len:3599 00:08:09.537 [2024-04-27 06:50:39.187947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.188067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.537 [2024-04-27 
06:50:39.188092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.188209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:09.537 [2024-04-27 06:50:39.188234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.537 #44 NEW cov: 11742 ft: 14009 corp: 23/631b lim: 50 exec/s: 44 rss: 69Mb L: 42/42 MS: 1 InsertByte- 00:08:09.537 [2024-04-27 06:50:39.228015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3599 00:08:09.537 [2024-04-27 06:50:39.228045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.228167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.537 [2024-04-27 06:50:39.228189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.228317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.537 [2024-04-27 06:50:39.228342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.537 #45 NEW cov: 11742 ft: 14059 corp: 24/664b lim: 50 exec/s: 45 rss: 69Mb L: 33/42 MS: 1 CopyPart- 00:08:09.537 [2024-04-27 06:50:39.278256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.537 [2024-04-27 06:50:39.278294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.278419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1013014083341782542 len:61939 00:08:09.537 [2024-04-27 06:50:39.278454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.537 #46 NEW cov: 11742 ft: 14136 corp: 25/688b lim: 50 exec/s: 46 rss: 69Mb L: 24/42 MS: 1 ChangeBinInt- 00:08:09.537 [2024-04-27 06:50:39.317930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3605 00:08:09.537 [2024-04-27 06:50:39.317955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.318077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.537 [2024-04-27 06:50:39.318097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.537 #47 NEW cov: 11742 ft: 14222 corp: 26/712b lim: 50 exec/s: 47 rss: 69Mb L: 24/42 MS: 1 ChangeBinInt- 00:08:09.537 [2024-04-27 06:50:39.358419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:1012762868456492558 len:3599 00:08:09.537 [2024-04-27 06:50:39.358453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.358560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762454092811790 len:5655 00:08:09.537 [2024-04-27 06:50:39.358580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.358692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:72081878153625599 len:5655 00:08:09.537 [2024-04-27 06:50:39.358715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.358834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1591483802437686806 len:3599 00:08:09.537 [2024-04-27 06:50:39.358857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.537 #48 NEW cov: 11742 ft: 14232 corp: 27/758b lim: 50 exec/s: 48 rss: 69Mb L: 46/46 MS: 1 PersAutoDict- DE: "\377\377\001\000"- 00:08:09.537 [2024-04-27 06:50:39.418679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366950100494 len:3599 00:08:09.537 [2024-04-27 06:50:39.418713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.537 [2024-04-27 06:50:39.418843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.537 [2024-04-27 06:50:39.418867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.797 #49 NEW cov: 11742 ft: 14254 corp: 28/781b lim: 50 exec/s: 49 rss: 69Mb L: 23/46 MS: 1 CopyPart- 00:08:09.797 [2024-04-27 06:50:39.469334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3599 00:08:09.797 [2024-04-27 06:50:39.469367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.797 [2024-04-27 06:50:39.469456] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012763398985616910 len:3599 00:08:09.797 [2024-04-27 06:50:39.469481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.797 [2024-04-27 06:50:39.469600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.797 [2024-04-27 06:50:39.469624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.797 [2024-04-27 06:50:39.469742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:09.797 [2024-04-27 06:50:39.469766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.797 [2024-04-27 06:50:39.469879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:1012762423573213672 len:3721 00:08:09.797 [2024-04-27 06:50:39.469902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:09.797 #50 NEW cov: 11742 ft: 14379 corp: 29/831b lim: 50 exec/s: 50 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:08:09.797 [2024-04-27 06:50:39.518669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012798705663610382 len:3599 00:08:09.797 [2024-04-27 06:50:39.518700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.518813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:09.798 [2024-04-27 06:50:39.518837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.798 #51 NEW cov: 11742 ft: 14396 corp: 30/855b lim: 50 exec/s: 51 rss: 69Mb L: 24/50 MS: 1 ChangeByte- 00:08:09.798 [2024-04-27 06:50:39.569134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366950100494 len:3599 00:08:09.798 [2024-04-27 06:50:39.569167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.569288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:72073047566123007 len:3599 00:08:09.798 [2024-04-27 06:50:39.569326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.798 #52 NEW cov: 11742 ft: 14405 corp: 31/882b lim: 50 exec/s: 52 rss: 69Mb L: 27/50 MS: 1 PersAutoDict- DE: "\377\377\001\000"- 00:08:09.798 [2024-04-27 06:50:39.608878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366950100494 len:3599 00:08:09.798 [2024-04-27 06:50:39.608911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.609019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073544 len:3599 00:08:09.798 [2024-04-27 06:50:39.609040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.609155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.798 [2024-04-27 06:50:39.609178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.798 #53 NEW cov: 11742 ft: 14416 corp: 32/920b lim: 50 exec/s: 53 rss: 70Mb L: 38/50 MS: 1 CrossOver- 00:08:09.798 [2024-04-27 06:50:39.669794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762419791400462 len:3599 00:08:09.798 [2024-04-27 06:50:39.669825] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.669945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012763076871065102 len:3599 00:08:09.798 [2024-04-27 06:50:39.669968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.670092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:09.798 [2024-04-27 06:50:39.670117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.798 [2024-04-27 06:50:39.670235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:09.798 [2024-04-27 06:50:39.670268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.057 #54 NEW cov: 11742 ft: 14487 corp: 33/962b lim: 50 exec/s: 54 rss: 70Mb L: 42/50 MS: 1 ShuffleBytes- 00:08:10.057 [2024-04-27 06:50:39.719347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012833888047205902 len:3599 00:08:10.057 [2024-04-27 06:50:39.719382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.719504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:10.057 [2024-04-27 06:50:39.719530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.057 #55 NEW cov: 11742 ft: 14546 corp: 34/987b lim: 50 exec/s: 55 rss: 70Mb L: 25/50 MS: 1 InsertByte- 00:08:10.057 [2024-04-27 06:50:39.770069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762421779893774 len:3741 00:08:10.057 [2024-04-27 06:50:39.770101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.770215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11285066962739960988 len:40093 00:08:10.057 [2024-04-27 06:50:39.770236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.770357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:11285066962739960988 len:39951 00:08:10.057 [2024-04-27 06:50:39.770380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.770502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:10.057 [2024-04-27 06:50:39.770525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.057 #56 NEW cov: 11742 ft: 14624 corp: 35/1031b lim: 
50 exec/s: 56 rss: 70Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:08:10.057 [2024-04-27 06:50:39.810152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012763031665249806 len:3599 00:08:10.057 [2024-04-27 06:50:39.810186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.810309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11285066962739960988 len:40093 00:08:10.057 [2024-04-27 06:50:39.810331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.810460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:11285066962739960988 len:39951 00:08:10.057 [2024-04-27 06:50:39.810482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.810611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:10.057 [2024-04-27 06:50:39.810635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.057 #57 NEW cov: 11742 ft: 14662 corp: 36/1075b lim: 50 exec/s: 57 rss: 70Mb L: 44/50 MS: 1 ShuffleBytes- 00:08:10.057 [2024-04-27 06:50:39.870019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366966877710 len:3599 00:08:10.057 [2024-04-27 06:50:39.870053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.057 [2024-04-27 06:50:39.870175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 00:08:10.057 [2024-04-27 06:50:39.870200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.058 #58 NEW cov: 11742 ft: 14719 corp: 37/1098b lim: 50 exec/s: 58 rss: 70Mb L: 23/50 MS: 1 ChangeBinInt- 00:08:10.058 [2024-04-27 06:50:39.920460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:9803788892360281617 len:3599 00:08:10.058 [2024-04-27 06:50:39.920491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.058 [2024-04-27 06:50:39.920578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9803788892360281614 len:3599 00:08:10.058 [2024-04-27 06:50:39.920599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.058 [2024-04-27 06:50:39.920715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762422299987470 len:3599 00:08:10.058 [2024-04-27 06:50:39.920739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.058 [2024-04-27 06:50:39.920850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:10.058 [2024-04-27 06:50:39.920871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.058 #59 NEW cov: 11742 ft: 14734 corp: 38/1143b lim: 50 exec/s: 59 rss: 70Mb L: 45/50 MS: 1 CopyPart- 00:08:10.317 [2024-04-27 06:50:39.970454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047102366950100494 len:3599 00:08:10.317 [2024-04-27 06:50:39.970484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.317 [2024-04-27 06:50:39.970563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1011636519826230920 len:3599 00:08:10.317 [2024-04-27 06:50:39.970585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.317 [2024-04-27 06:50:39.970705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1012762419733073422 len:3599 00:08:10.317 [2024-04-27 06:50:39.970727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.317 #60 NEW cov: 11742 ft: 14754 corp: 39/1181b lim: 50 exec/s: 60 rss: 70Mb L: 38/50 MS: 1 ChangeBit- 00:08:10.317 [2024-04-27 06:50:40.020893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1012762481909435918 len:3599 00:08:10.317 [2024-04-27 06:50:40.020927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.317 [2024-04-27 06:50:40.021032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11285066962739960988 len:40093 00:08:10.317 [2024-04-27 06:50:40.021057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.317 [2024-04-27 06:50:40.021177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:11285066962739960988 len:39951 00:08:10.317 [2024-04-27 06:50:40.021200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.317 [2024-04-27 06:50:40.021323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1012762419733073422 len:3599 00:08:10.317 [2024-04-27 06:50:40.021345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.317 #61 NEW cov: 11742 ft: 14789 corp: 40/1225b lim: 50 exec/s: 30 rss: 70Mb L: 44/50 MS: 1 ChangeBit- 00:08:10.317 #61 DONE cov: 11742 ft: 14789 corp: 40/1225b lim: 50 exec/s: 30 rss: 70Mb 00:08:10.317 ###### Recommended dictionary. ###### 00:08:10.317 "\377\377\001\000" # Uses: 2 00:08:10.317 ###### End of recommended dictionary. 
###### 00:08:10.317 Done 61 runs in 2 second(s) 00:08:10.317 06:50:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:10.317 06:50:40 -- ../common.sh@72 -- # (( i++ )) 00:08:10.317 06:50:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.317 06:50:40 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:10.317 06:50:40 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:10.317 06:50:40 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.317 06:50:40 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.317 06:50:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:10.317 06:50:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:10.317 06:50:40 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:10.317 06:50:40 -- nvmf/run.sh@29 -- # port=4420 00:08:10.317 06:50:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:10.317 06:50:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:10.317 06:50:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.317 06:50:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:10.317 [2024-04-27 06:50:40.184775] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:10.317 [2024-04-27 06:50:40.184828] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2627384 ] 00:08:10.577 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.577 [2024-04-27 06:50:40.360484] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.577 [2024-04-27 06:50:40.379995] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.577 [2024-04-27 06:50:40.380120] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.577 [2024-04-27 06:50:40.431714] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.577 [2024-04-27 06:50:40.448011] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:10.577 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.577 INFO: Seed: 390848621 00:08:10.835 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:10.835 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:10.835 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:10.835 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.835 #2 INITED exec/s: 0 rss: 60Mb 00:08:10.835 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
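
The nvmf/run.sh trace above shows how each fuzzing pass is staged: the pass index is zero-padded with printf %02d and appended to 44 to pick the TCP port, sed rewrites the trsvcid in fuzz_json.conf, an empty per-pass corpus directory is created, and the llvm_nvme_fuzz harness is launched against it. A minimal standalone sketch of that sequence follows; SPDK_DIR is a placeholder, and the per-flag comments are inferred from the trace rather than taken from SPDK documentation:

  i=20
  port=44$(printf %02d "$i")                 # pass 20 -> port 4420
  corpus="$SPDK_DIR/../corpus/llvm_nvmf_$i"
  mkdir -p "$corpus"                         # each pass starts from an empty corpus
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"
  # -m/-s/-r are common SPDK app options (core mask, memory size in MB, RPC
  # socket); -P, -F, -c, -t, -D and -Z appear to be output path, target trid,
  # config file, run time in seconds, corpus directory and fuzzer selector --
  # inferred from the trace above, not verified against SPDK docs.
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK_DIR/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "/tmp/fuzz_json_$i.conf" -t 1 -D "$corpus" -Z "$i" -r "/var/tmp/spdk$i.sock"
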
00:08:10.835 This may also happen if the target rejected all inputs we tried so far 00:08:10.835 [2024-04-27 06:50:40.514475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:10.835 [2024-04-27 06:50:40.514517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.835 [2024-04-27 06:50:40.514657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:10.836 [2024-04-27 06:50:40.514685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.836 [2024-04-27 06:50:40.514799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:10.836 [2024-04-27 06:50:40.514820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.836 [2024-04-27 06:50:40.514954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:10.836 [2024-04-27 06:50:40.514978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.095 NEW_FUNC[1/665]: 0x4bf6e0 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:11.095 NEW_FUNC[2/665]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.095 #18 NEW cov: 11573 ft: 11574 corp: 2/87b lim: 90 exec/s: 0 rss: 67Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:08:11.095 [2024-04-27 06:50:40.844823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.095 [2024-04-27 06:50:40.844861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.095 [2024-04-27 06:50:40.844979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.095 [2024-04-27 06:50:40.845005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.095 #21 NEW cov: 11686 ft: 12473 corp: 3/125b lim: 90 exec/s: 0 rss: 67Mb L: 38/86 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:08:11.095 [2024-04-27 06:50:40.884652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.096 [2024-04-27 06:50:40.884679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.096 #22 NEW cov: 11692 ft: 13590 corp: 4/157b lim: 90 exec/s: 0 rss: 67Mb L: 32/86 MS: 1 CrossOver- 00:08:11.096 [2024-04-27 06:50:40.925344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.096 [2024-04-27 06:50:40.925377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.096 [2024-04-27 06:50:40.925450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.096 [2024-04-27 06:50:40.925471] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.096 [2024-04-27 06:50:40.925578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.096 [2024-04-27 06:50:40.925600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.096 [2024-04-27 06:50:40.925716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:11.096 [2024-04-27 06:50:40.925739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.096 #23 NEW cov: 11777 ft: 13828 corp: 5/245b lim: 90 exec/s: 0 rss: 67Mb L: 88/88 MS: 1 CrossOver- 00:08:11.096 [2024-04-27 06:50:40.975644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.096 [2024-04-27 06:50:40.975675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.096 [2024-04-27 06:50:40.975766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.096 [2024-04-27 06:50:40.975788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.096 [2024-04-27 06:50:40.975902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.096 [2024-04-27 06:50:40.975926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.096 [2024-04-27 06:50:40.976044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:11.096 [2024-04-27 06:50:40.976066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.355 #24 NEW cov: 11777 ft: 13982 corp: 6/333b lim: 90 exec/s: 0 rss: 67Mb L: 88/88 MS: 1 ChangeBinInt- 00:08:11.355 [2024-04-27 06:50:41.025029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.355 [2024-04-27 06:50:41.025057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.355 #25 NEW cov: 11777 ft: 14077 corp: 7/366b lim: 90 exec/s: 0 rss: 68Mb L: 33/88 MS: 1 InsertByte- 00:08:11.355 [2024-04-27 06:50:41.075893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.355 [2024-04-27 06:50:41.075924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.355 [2024-04-27 06:50:41.076029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.355 [2024-04-27 06:50:41.076052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.355 [2024-04-27 06:50:41.076168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.355 [2024-04-27 06:50:41.076189] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.355 [2024-04-27 06:50:41.076303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:11.355 [2024-04-27 06:50:41.076326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.355 #26 NEW cov: 11777 ft: 14129 corp: 8/454b lim: 90 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 CopyPart- 00:08:11.355 [2024-04-27 06:50:41.125595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.355 [2024-04-27 06:50:41.125627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.355 [2024-04-27 06:50:41.125747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.355 [2024-04-27 06:50:41.125770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.355 #28 NEW cov: 11777 ft: 14228 corp: 9/504b lim: 90 exec/s: 0 rss: 68Mb L: 50/88 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:11.355 [2024-04-27 06:50:41.165721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.355 [2024-04-27 06:50:41.165753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.355 [2024-04-27 06:50:41.165869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.355 [2024-04-27 06:50:41.165891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.355 #29 NEW cov: 11777 ft: 14298 corp: 10/542b lim: 90 exec/s: 0 rss: 68Mb L: 38/88 MS: 1 ShuffleBytes- 00:08:11.355 [2024-04-27 06:50:41.215606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.355 [2024-04-27 06:50:41.215632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.355 #30 NEW cov: 11777 ft: 14397 corp: 11/560b lim: 90 exec/s: 0 rss: 68Mb L: 18/88 MS: 1 EraseBytes- 00:08:11.614 [2024-04-27 06:50:41.266095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.614 [2024-04-27 06:50:41.266122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.266244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.615 [2024-04-27 06:50:41.266267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.615 #33 NEW cov: 11777 ft: 14434 corp: 12/603b lim: 90 exec/s: 0 rss: 68Mb L: 43/88 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:11.615 [2024-04-27 06:50:41.306168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.615 [2024-04-27 06:50:41.306195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.306327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.615 [2024-04-27 06:50:41.306350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.615 #39 NEW cov: 11777 ft: 14479 corp: 13/647b lim: 90 exec/s: 0 rss: 68Mb L: 44/88 MS: 1 InsertByte- 00:08:11.615 [2024-04-27 06:50:41.356483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.615 [2024-04-27 06:50:41.356516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.356637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.615 [2024-04-27 06:50:41.356663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.356788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.615 [2024-04-27 06:50:41.356813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.615 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.615 #40 NEW cov: 11800 ft: 14777 corp: 14/704b lim: 90 exec/s: 0 rss: 69Mb L: 57/88 MS: 1 InsertRepeatedBytes- 00:08:11.615 [2024-04-27 06:50:41.406681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.615 [2024-04-27 06:50:41.406721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.406845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.615 [2024-04-27 06:50:41.406870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.407003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.615 [2024-04-27 06:50:41.407028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.615 #41 NEW cov: 11800 ft: 14804 corp: 15/768b lim: 90 exec/s: 0 rss: 69Mb L: 64/88 MS: 1 EraseBytes- 00:08:11.615 [2024-04-27 06:50:41.446250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.615 [2024-04-27 06:50:41.446276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.615 #42 NEW cov: 11800 ft: 14847 corp: 16/789b lim: 90 exec/s: 0 rss: 69Mb L: 21/88 MS: 1 EraseBytes- 00:08:11.615 [2024-04-27 06:50:41.486860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.615 [2024-04-27 06:50:41.486889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.615 
[2024-04-27 06:50:41.486995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.615 [2024-04-27 06:50:41.487020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.615 [2024-04-27 06:50:41.487142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.615 [2024-04-27 06:50:41.487162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.874 #43 NEW cov: 11800 ft: 14890 corp: 17/855b lim: 90 exec/s: 43 rss: 69Mb L: 66/88 MS: 1 InsertRepeatedBytes- 00:08:11.874 [2024-04-27 06:50:41.526432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.874 [2024-04-27 06:50:41.526457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.874 #44 NEW cov: 11800 ft: 14898 corp: 18/876b lim: 90 exec/s: 44 rss: 69Mb L: 21/88 MS: 1 CopyPart- 00:08:11.874 [2024-04-27 06:50:41.567310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.874 [2024-04-27 06:50:41.567341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.874 [2024-04-27 06:50:41.567428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.874 [2024-04-27 06:50:41.567451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.874 [2024-04-27 06:50:41.567580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.874 [2024-04-27 06:50:41.567605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.874 [2024-04-27 06:50:41.567727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:11.874 [2024-04-27 06:50:41.567754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.874 #45 NEW cov: 11800 ft: 14921 corp: 19/965b lim: 90 exec/s: 45 rss: 69Mb L: 89/89 MS: 1 CrossOver- 00:08:11.874 [2024-04-27 06:50:41.607083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.874 [2024-04-27 06:50:41.607109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.874 [2024-04-27 06:50:41.607242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.874 [2024-04-27 06:50:41.607267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.874 #46 NEW cov: 11800 ft: 14948 corp: 20/1003b lim: 90 exec/s: 46 rss: 69Mb L: 38/89 MS: 1 ChangeBit- 00:08:11.875 [2024-04-27 06:50:41.647689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.875 [2024-04-27 06:50:41.647724] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.875 [2024-04-27 06:50:41.647847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.875 [2024-04-27 06:50:41.647869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.875 [2024-04-27 06:50:41.647986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.875 [2024-04-27 06:50:41.648008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.875 [2024-04-27 06:50:41.648127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:11.875 [2024-04-27 06:50:41.648147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.875 #47 NEW cov: 11800 ft: 14966 corp: 21/1090b lim: 90 exec/s: 47 rss: 69Mb L: 87/89 MS: 1 InsertRepeatedBytes- 00:08:11.875 [2024-04-27 06:50:41.687252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.875 [2024-04-27 06:50:41.687283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.875 [2024-04-27 06:50:41.687375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.875 [2024-04-27 06:50:41.687401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.875 #48 NEW cov: 11800 ft: 14990 corp: 22/1140b lim: 90 exec/s: 48 rss: 69Mb L: 50/89 MS: 1 ChangeBit- 00:08:11.875 [2024-04-27 06:50:41.727150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.875 [2024-04-27 06:50:41.727175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.875 #49 NEW cov: 11800 ft: 15004 corp: 23/1162b lim: 90 exec/s: 49 rss: 69Mb L: 22/89 MS: 1 InsertByte- 00:08:11.875 [2024-04-27 06:50:41.767702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:11.875 [2024-04-27 06:50:41.767748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.875 [2024-04-27 06:50:41.767861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:11.875 [2024-04-27 06:50:41.767885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.875 [2024-04-27 06:50:41.768001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:11.875 [2024-04-27 06:50:41.768023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.134 #50 NEW cov: 11800 ft: 15086 corp: 24/1228b lim: 90 exec/s: 50 rss: 69Mb L: 66/89 MS: 1 ChangeByte- 00:08:12.134 [2024-04-27 06:50:41.807676] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.134 [2024-04-27 06:50:41.807700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.807840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.134 [2024-04-27 06:50:41.807865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.134 #51 NEW cov: 11800 ft: 15110 corp: 25/1266b lim: 90 exec/s: 51 rss: 69Mb L: 38/89 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:12.134 [2024-04-27 06:50:41.847735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.134 [2024-04-27 06:50:41.847766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.847891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.134 [2024-04-27 06:50:41.847915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.134 #52 NEW cov: 11800 ft: 15116 corp: 26/1312b lim: 90 exec/s: 52 rss: 70Mb L: 46/89 MS: 1 CrossOver- 00:08:12.134 [2024-04-27 06:50:41.888300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.134 [2024-04-27 06:50:41.888333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.888427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.134 [2024-04-27 06:50:41.888462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.888573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:12.134 [2024-04-27 06:50:41.888596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.888680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:12.134 [2024-04-27 06:50:41.888700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.134 #53 NEW cov: 11800 ft: 15122 corp: 27/1400b lim: 90 exec/s: 53 rss: 70Mb L: 88/89 MS: 1 CrossOver- 00:08:12.134 [2024-04-27 06:50:41.927986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.134 [2024-04-27 06:50:41.928020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.928142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.134 [2024-04-27 06:50:41.928171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.134 #54 NEW cov: 
11800 ft: 15148 corp: 28/1451b lim: 90 exec/s: 54 rss: 70Mb L: 51/89 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:12.134 [2024-04-27 06:50:41.968038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.134 [2024-04-27 06:50:41.968071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.134 [2024-04-27 06:50:41.968195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.134 [2024-04-27 06:50:41.968216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.135 #55 NEW cov: 11800 ft: 15158 corp: 29/1489b lim: 90 exec/s: 55 rss: 70Mb L: 38/89 MS: 1 ShuffleBytes- 00:08:12.135 [2024-04-27 06:50:42.008216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.135 [2024-04-27 06:50:42.008241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.135 [2024-04-27 06:50:42.008356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.135 [2024-04-27 06:50:42.008381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.135 #61 NEW cov: 11800 ft: 15169 corp: 30/1532b lim: 90 exec/s: 61 rss: 70Mb L: 43/89 MS: 1 CopyPart- 00:08:12.394 [2024-04-27 06:50:42.048321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.394 [2024-04-27 06:50:42.048353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.394 [2024-04-27 06:50:42.048483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.394 [2024-04-27 06:50:42.048509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.394 #62 NEW cov: 11800 ft: 15179 corp: 31/1570b lim: 90 exec/s: 62 rss: 70Mb L: 38/89 MS: 1 CMP- DE: "\001\015"- 00:08:12.394 [2024-04-27 06:50:42.088470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.394 [2024-04-27 06:50:42.088497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.394 [2024-04-27 06:50:42.088630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.394 [2024-04-27 06:50:42.088655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.394 #63 NEW cov: 11800 ft: 15222 corp: 32/1616b lim: 90 exec/s: 63 rss: 70Mb L: 46/89 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:12.394 [2024-04-27 06:50:42.138692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.394 [2024-04-27 06:50:42.138719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.394 [2024-04-27 
06:50:42.138833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.394 [2024-04-27 06:50:42.138856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.394 #64 NEW cov: 11800 ft: 15236 corp: 33/1654b lim: 90 exec/s: 64 rss: 70Mb L: 38/89 MS: 1 PersAutoDict- DE: "\001\015"- 00:08:12.394 [2024-04-27 06:50:42.188766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.394 [2024-04-27 06:50:42.188796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.394 [2024-04-27 06:50:42.188907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.394 [2024-04-27 06:50:42.188929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.394 #65 NEW cov: 11800 ft: 15277 corp: 34/1700b lim: 90 exec/s: 65 rss: 70Mb L: 46/89 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:12.394 [2024-04-27 06:50:42.228965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.394 [2024-04-27 06:50:42.228990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.394 [2024-04-27 06:50:42.229121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.394 [2024-04-27 06:50:42.229145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.394 #66 NEW cov: 11800 ft: 15310 corp: 35/1747b lim: 90 exec/s: 66 rss: 70Mb L: 47/89 MS: 1 CMP- DE: "\006\000\000\000"- 00:08:12.394 [2024-04-27 06:50:42.269032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.394 [2024-04-27 06:50:42.269065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.394 [2024-04-27 06:50:42.269181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.394 [2024-04-27 06:50:42.269205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.654 #67 NEW cov: 11800 ft: 15318 corp: 36/1793b lim: 90 exec/s: 67 rss: 70Mb L: 46/89 MS: 1 ChangeBinInt- 00:08:12.654 [2024-04-27 06:50:42.309632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.654 [2024-04-27 06:50:42.309661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.309754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.654 [2024-04-27 06:50:42.309779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.309904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:12.654 [2024-04-27 
06:50:42.309931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.310055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:12.654 [2024-04-27 06:50:42.310078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.654 #68 NEW cov: 11800 ft: 15402 corp: 37/1880b lim: 90 exec/s: 68 rss: 70Mb L: 87/89 MS: 1 ChangeBit- 00:08:12.654 [2024-04-27 06:50:42.349784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.654 [2024-04-27 06:50:42.349817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.349913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.654 [2024-04-27 06:50:42.349934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.350046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:12.654 [2024-04-27 06:50:42.350069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.350180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:12.654 [2024-04-27 06:50:42.350204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.654 #69 NEW cov: 11800 ft: 15404 corp: 38/1967b lim: 90 exec/s: 69 rss: 70Mb L: 87/89 MS: 1 ChangeByte- 00:08:12.654 [2024-04-27 06:50:42.389824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.654 [2024-04-27 06:50:42.389864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.389978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.654 [2024-04-27 06:50:42.390001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.654 #70 NEW cov: 11809 ft: 15529 corp: 39/2013b lim: 90 exec/s: 70 rss: 70Mb L: 46/89 MS: 1 InsertRepeatedBytes- 00:08:12.654 [2024-04-27 06:50:42.429266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.654 [2024-04-27 06:50:42.429292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.654 #71 NEW cov: 11809 ft: 15535 corp: 40/2041b lim: 90 exec/s: 71 rss: 70Mb L: 28/89 MS: 1 EraseBytes- 00:08:12.654 [2024-04-27 06:50:42.469937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:12.654 [2024-04-27 06:50:42.469970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.470079] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:12.654 [2024-04-27 06:50:42.470102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.654 [2024-04-27 06:50:42.470217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:12.654 [2024-04-27 06:50:42.470240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.654 #74 NEW cov: 11809 ft: 15540 corp: 41/2096b lim: 90 exec/s: 74 rss: 70Mb L: 55/89 MS: 3 CopyPart-CopyPart-InsertRepeatedBytes- 00:08:12.654 #74 DONE cov: 11809 ft: 15540 corp: 41/2096b lim: 90 exec/s: 37 rss: 70Mb 00:08:12.654 ###### Recommended dictionary. ###### 00:08:12.654 "\000\000\000\000\000\000\000\000" # Uses: 2 00:08:12.654 "\001\015" # Uses: 1 00:08:12.654 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:12.654 "\006\000\000\000" # Uses: 0 00:08:12.654 ###### End of recommended dictionary. ###### 00:08:12.654 Done 74 runs in 2 second(s) 00:08:12.913 06:50:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:12.913 06:50:42 -- ../common.sh@72 -- # (( i++ )) 00:08:12.913 06:50:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.913 06:50:42 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:12.913 06:50:42 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:12.913 06:50:42 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.913 06:50:42 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.913 06:50:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:12.913 06:50:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:12.913 06:50:42 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:12.913 06:50:42 -- nvmf/run.sh@29 -- # port=4421 00:08:12.913 06:50:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:12.913 06:50:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:12.913 06:50:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.913 06:50:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:12.913 [2024-04-27 06:50:42.645260] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:08:12.913 [2024-04-27 06:50:42.645344] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2627767 ] 00:08:12.913 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.171 [2024-04-27 06:50:42.825901] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.171 [2024-04-27 06:50:42.845304] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.171 [2024-04-27 06:50:42.845452] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.171 [2024-04-27 06:50:42.896953] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.171 [2024-04-27 06:50:42.913255] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:13.171 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.171 INFO: Seed: 2854855195 00:08:13.171 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:13.171 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:13.171 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:13.171 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.171 #2 INITED exec/s: 0 rss: 59Mb 00:08:13.171 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.171 This may also happen if the target rejected all inputs we tried so far 00:08:13.171 [2024-04-27 06:50:42.989110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.172 [2024-04-27 06:50:42.989156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.430 NEW_FUNC[1/665]: 0x4c2900 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:13.430 NEW_FUNC[2/665]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.430 #28 NEW cov: 11546 ft: 11547 corp: 2/12b lim: 50 exec/s: 0 rss: 67Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:08:13.430 [2024-04-27 06:50:43.319867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.430 [2024-04-27 06:50:43.319922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 #29 NEW cov: 11661 ft: 12105 corp: 3/23b lim: 50 exec/s: 0 rss: 67Mb L: 11/11 MS: 1 ChangeByte- 00:08:13.688 [2024-04-27 06:50:43.369966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.688 [2024-04-27 06:50:43.369992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 #30 NEW cov: 11667 ft: 12479 corp: 4/34b lim: 50 exec/s: 0 rss: 67Mb L: 11/11 MS: 1 ChangeBinInt- 00:08:13.688 [2024-04-27 06:50:43.410202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.688 [2024-04-27 06:50:43.410234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 [2024-04-27 06:50:43.410352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.688 [2024-04-27 06:50:43.410372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.688 #36 NEW cov: 11752 ft: 13610 corp: 5/56b lim: 50 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 CopyPart- 00:08:13.688 [2024-04-27 06:50:43.460260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.688 [2024-04-27 06:50:43.460287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 #37 NEW cov: 11752 ft: 13690 corp: 6/67b lim: 50 exec/s: 0 rss: 67Mb L: 11/22 MS: 1 ChangeBinInt- 00:08:13.688 [2024-04-27 06:50:43.500605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.688 [2024-04-27 06:50:43.500639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 [2024-04-27 06:50:43.500752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.688 [2024-04-27 06:50:43.500772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.688 #38 NEW cov: 11752 ft: 13770 corp: 7/89b lim: 50 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ChangeBinInt- 00:08:13.688 [2024-04-27 06:50:43.540330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.688 [2024-04-27 06:50:43.540363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 [2024-04-27 06:50:43.540483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.688 [2024-04-27 06:50:43.540507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.688 #39 NEW cov: 11752 ft: 13855 corp: 8/111b lim: 50 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ChangeByte- 00:08:13.946 [2024-04-27 06:50:43.590606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.590634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.946 #40 NEW cov: 11752 ft: 13921 corp: 9/122b lim: 50 exec/s: 0 rss: 68Mb L: 11/22 MS: 1 ChangeBit- 00:08:13.946 [2024-04-27 06:50:43.641103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.641139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.946 [2024-04-27 06:50:43.641266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.946 [2024-04-27 06:50:43.641289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.946 #41 NEW cov: 11752 ft: 13970 corp: 10/143b lim: 50 
exec/s: 0 rss: 68Mb L: 21/22 MS: 1 EraseBytes- 00:08:13.946 [2024-04-27 06:50:43.680917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.680948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.946 #42 NEW cov: 11752 ft: 14005 corp: 11/162b lim: 50 exec/s: 0 rss: 68Mb L: 19/22 MS: 1 CopyPart- 00:08:13.946 [2024-04-27 06:50:43.721272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.721305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.946 [2024-04-27 06:50:43.721422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.946 [2024-04-27 06:50:43.721446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.946 #43 NEW cov: 11752 ft: 14109 corp: 12/184b lim: 50 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ShuffleBytes- 00:08:13.946 [2024-04-27 06:50:43.761086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.761117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.946 [2024-04-27 06:50:43.761206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.946 [2024-04-27 06:50:43.761230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.946 #44 NEW cov: 11752 ft: 14147 corp: 13/206b lim: 50 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ChangeBit- 00:08:13.946 [2024-04-27 06:50:43.801567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.801598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.946 [2024-04-27 06:50:43.801699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:13.946 [2024-04-27 06:50:43.801723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.946 #45 NEW cov: 11752 ft: 14158 corp: 14/229b lim: 50 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 InsertByte- 00:08:13.946 [2024-04-27 06:50:43.841317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:13.946 [2024-04-27 06:50:43.841349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.206 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.206 #46 NEW cov: 11775 ft: 14194 corp: 15/241b lim: 50 exec/s: 0 rss: 68Mb L: 12/23 MS: 1 InsertByte- 00:08:14.206 [2024-04-27 06:50:43.881820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.206 [2024-04-27 06:50:43.881853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.206 [2024-04-27 06:50:43.881976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.206 [2024-04-27 06:50:43.881999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.206 #47 NEW cov: 11775 ft: 14218 corp: 16/263b lim: 50 exec/s: 0 rss: 68Mb L: 22/23 MS: 1 ChangeBinInt- 00:08:14.206 [2024-04-27 06:50:43.921694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.206 [2024-04-27 06:50:43.921725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.206 #52 NEW cov: 11775 ft: 14274 corp: 17/273b lim: 50 exec/s: 0 rss: 68Mb L: 10/23 MS: 5 InsertByte-CrossOver-CrossOver-CopyPart-CopyPart- 00:08:14.206 [2024-04-27 06:50:43.962275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.206 [2024-04-27 06:50:43.962307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.206 [2024-04-27 06:50:43.962423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.206 [2024-04-27 06:50:43.962448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.206 [2024-04-27 06:50:43.962585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.206 [2024-04-27 06:50:43.962613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.206 #53 NEW cov: 11775 ft: 14594 corp: 18/306b lim: 50 exec/s: 53 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:14.206 [2024-04-27 06:50:44.011901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.206 [2024-04-27 06:50:44.011928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.206 #54 NEW cov: 11775 ft: 14637 corp: 19/317b lim: 50 exec/s: 54 rss: 68Mb L: 11/33 MS: 1 ChangeBinInt- 00:08:14.206 [2024-04-27 06:50:44.062683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.206 [2024-04-27 06:50:44.062717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.206 [2024-04-27 06:50:44.062846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.206 [2024-04-27 06:50:44.062869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.206 [2024-04-27 06:50:44.062997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.206 [2024-04-27 06:50:44.063020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.206 #55 NEW cov: 11775 ft: 14687 corp: 20/347b lim: 50 exec/s: 55 rss: 68Mb L: 30/33 MS: 
1 CopyPart- 00:08:14.465 [2024-04-27 06:50:44.122310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.465 [2024-04-27 06:50:44.122340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.465 #56 NEW cov: 11775 ft: 14706 corp: 21/358b lim: 50 exec/s: 56 rss: 69Mb L: 11/33 MS: 1 ChangeByte- 00:08:14.465 [2024-04-27 06:50:44.162632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.465 [2024-04-27 06:50:44.162664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.465 [2024-04-27 06:50:44.162785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.465 [2024-04-27 06:50:44.162811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.465 #57 NEW cov: 11775 ft: 14714 corp: 22/387b lim: 50 exec/s: 57 rss: 69Mb L: 29/33 MS: 1 InsertRepeatedBytes- 00:08:14.465 [2024-04-27 06:50:44.203001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.465 [2024-04-27 06:50:44.203037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.465 [2024-04-27 06:50:44.203169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.465 [2024-04-27 06:50:44.203193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.465 [2024-04-27 06:50:44.203325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.465 [2024-04-27 06:50:44.203347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.465 #58 NEW cov: 11775 ft: 14738 corp: 23/421b lim: 50 exec/s: 58 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:14.465 [2024-04-27 06:50:44.242282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.465 [2024-04-27 06:50:44.242315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.465 #59 NEW cov: 11775 ft: 14741 corp: 24/432b lim: 50 exec/s: 59 rss: 69Mb L: 11/34 MS: 1 ChangeBinInt- 00:08:14.465 [2024-04-27 06:50:44.292805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.465 [2024-04-27 06:50:44.292832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.465 #60 NEW cov: 11775 ft: 14756 corp: 25/442b lim: 50 exec/s: 60 rss: 69Mb L: 10/34 MS: 1 ChangeBinInt- 00:08:14.465 [2024-04-27 06:50:44.332800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.465 [2024-04-27 06:50:44.332826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.465 [2024-04-27 06:50:44.332965] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.465 [2024-04-27 06:50:44.332989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.465 #61 NEW cov: 11775 ft: 14767 corp: 26/464b lim: 50 exec/s: 61 rss: 69Mb L: 22/34 MS: 1 CrossOver- 00:08:14.725 [2024-04-27 06:50:44.383137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.725 [2024-04-27 06:50:44.383164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.725 #62 NEW cov: 11775 ft: 14778 corp: 27/479b lim: 50 exec/s: 62 rss: 69Mb L: 15/34 MS: 1 CMP- DE: "\000\000\000\014"- 00:08:14.725 [2024-04-27 06:50:44.422738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.725 [2024-04-27 06:50:44.422772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.725 #63 NEW cov: 11775 ft: 14789 corp: 28/495b lim: 50 exec/s: 63 rss: 69Mb L: 16/34 MS: 1 InsertByte- 00:08:14.725 [2024-04-27 06:50:44.473397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.725 [2024-04-27 06:50:44.473425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.725 #64 NEW cov: 11775 ft: 14807 corp: 29/512b lim: 50 exec/s: 64 rss: 69Mb L: 17/34 MS: 1 EraseBytes- 00:08:14.725 [2024-04-27 06:50:44.523982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.725 [2024-04-27 06:50:44.524019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.725 [2024-04-27 06:50:44.524148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.725 [2024-04-27 06:50:44.524173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.725 [2024-04-27 06:50:44.524309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.725 [2024-04-27 06:50:44.524334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.725 #65 NEW cov: 11775 ft: 14870 corp: 30/544b lim: 50 exec/s: 65 rss: 69Mb L: 32/34 MS: 1 InsertRepeatedBytes- 00:08:14.725 [2024-04-27 06:50:44.573526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.725 [2024-04-27 06:50:44.573558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.725 #66 NEW cov: 11775 ft: 14885 corp: 31/563b lim: 50 exec/s: 66 rss: 69Mb L: 19/34 MS: 1 CrossOver- 00:08:14.725 [2024-04-27 06:50:44.603751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.725 [2024-04-27 06:50:44.603783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 #67 NEW cov: 
11775 ft: 14886 corp: 32/574b lim: 50 exec/s: 67 rss: 69Mb L: 11/34 MS: 1 ChangeBit- 00:08:14.985 [2024-04-27 06:50:44.644340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.985 [2024-04-27 06:50:44.644375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.644491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.985 [2024-04-27 06:50:44.644519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.644642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.985 [2024-04-27 06:50:44.644667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.644791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:14.985 [2024-04-27 06:50:44.644813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.985 #71 NEW cov: 11775 ft: 15185 corp: 33/622b lim: 50 exec/s: 71 rss: 69Mb L: 48/48 MS: 4 ShuffleBytes-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:14.985 [2024-04-27 06:50:44.684384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.985 [2024-04-27 06:50:44.684419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.684511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.985 [2024-04-27 06:50:44.684533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.684651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.985 [2024-04-27 06:50:44.684674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.985 #72 NEW cov: 11775 ft: 15233 corp: 34/656b lim: 50 exec/s: 72 rss: 70Mb L: 34/48 MS: 1 InsertByte- 00:08:14.985 [2024-04-27 06:50:44.734424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.985 [2024-04-27 06:50:44.734452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.734589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.985 [2024-04-27 06:50:44.734617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.985 #73 NEW cov: 11775 ft: 15267 corp: 35/678b lim: 50 exec/s: 73 rss: 70Mb L: 22/48 MS: 1 ShuffleBytes- 00:08:14.985 [2024-04-27 06:50:44.774085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.985 [2024-04-27 06:50:44.774117] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.774208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.985 [2024-04-27 06:50:44.774231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.985 #74 NEW cov: 11775 ft: 15284 corp: 36/701b lim: 50 exec/s: 74 rss: 70Mb L: 23/48 MS: 1 InsertByte- 00:08:14.985 [2024-04-27 06:50:44.814771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.985 [2024-04-27 06:50:44.814805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.814916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.985 [2024-04-27 06:50:44.814940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.815057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.985 [2024-04-27 06:50:44.815082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.985 #75 NEW cov: 11775 ft: 15285 corp: 37/738b lim: 50 exec/s: 75 rss: 70Mb L: 37/48 MS: 1 CrossOver- 00:08:14.985 [2024-04-27 06:50:44.854811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:14.985 [2024-04-27 06:50:44.854843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.854918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:14.985 [2024-04-27 06:50:44.854945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.855063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:14.985 [2024-04-27 06:50:44.855087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.985 [2024-04-27 06:50:44.855211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:14.985 [2024-04-27 06:50:44.855236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.245 #76 NEW cov: 11775 ft: 15295 corp: 38/786b lim: 50 exec/s: 76 rss: 70Mb L: 48/48 MS: 1 CopyPart- 00:08:15.245 [2024-04-27 06:50:44.904152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:15.245 [2024-04-27 06:50:44.904179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.245 #77 NEW cov: 11775 ft: 15325 corp: 39/797b lim: 50 exec/s: 77 rss: 70Mb L: 11/48 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:15.245 [2024-04-27 06:50:44.945254] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:15.245 [2024-04-27 06:50:44.945290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.245 [2024-04-27 06:50:44.945413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:15.245 [2024-04-27 06:50:44.945437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.245 [2024-04-27 06:50:44.945554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:15.245 [2024-04-27 06:50:44.945579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.245 #78 NEW cov: 11775 ft: 15336 corp: 40/831b lim: 50 exec/s: 39 rss: 70Mb L: 34/48 MS: 1 PersAutoDict- DE: "\000\000\000\014"- 00:08:15.245 #78 DONE cov: 11775 ft: 15336 corp: 40/831b lim: 50 exec/s: 39 rss: 70Mb 00:08:15.245 ###### Recommended dictionary. ###### 00:08:15.245 "\000\000\000\014" # Uses: 1 00:08:15.245 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:15.245 ###### End of recommended dictionary. ###### 00:08:15.245 Done 78 runs in 2 second(s) 00:08:15.245 06:50:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:15.245 06:50:45 -- ../common.sh@72 -- # (( i++ )) 00:08:15.245 06:50:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.245 06:50:45 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:15.245 06:50:45 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:15.245 06:50:45 -- nvmf/run.sh@24 -- # local timen=1 00:08:15.245 06:50:45 -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.245 06:50:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:15.245 06:50:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:15.245 06:50:45 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:15.245 06:50:45 -- nvmf/run.sh@29 -- # port=4422 00:08:15.245 06:50:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:15.245 06:50:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:15.245 06:50:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.246 06:50:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:15.246 [2024-04-27 06:50:45.129238] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:08:15.246 [2024-04-27 06:50:45.129325] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2628216 ] 00:08:15.505 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.505 [2024-04-27 06:50:45.381381] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.764 [2024-04-27 06:50:45.410883] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.764 [2024-04-27 06:50:45.411009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.764 [2024-04-27 06:50:45.462377] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.764 [2024-04-27 06:50:45.478690] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:15.764 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.764 INFO: Seed: 1126876829 00:08:15.764 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:15.764 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:15.764 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:15.764 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.764 #2 INITED exec/s: 0 rss: 59Mb 00:08:15.764 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.764 This may also happen if the target rejected all inputs we tried so far 00:08:15.764 [2024-04-27 06:50:45.534296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:15.764 [2024-04-27 06:50:45.534328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.764 [2024-04-27 06:50:45.534377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:15.764 [2024-04-27 06:50:45.534392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.764 [2024-04-27 06:50:45.534456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:15.764 [2024-04-27 06:50:45.534471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.764 [2024-04-27 06:50:45.534529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:15.764 [2024-04-27 06:50:45.534544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.023 NEW_FUNC[1/665]: 0x4c4bc0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:16.023 NEW_FUNC[2/665]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.023 #20 NEW cov: 11574 ft: 11575 corp: 2/69b lim: 85 exec/s: 0 rss: 67Mb L: 68/68 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:16.023 [2024-04-27 06:50:45.834808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:08:16.023 [2024-04-27 06:50:45.834842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.834890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.023 [2024-04-27 06:50:45.834905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.834958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.023 [2024-04-27 06:50:45.834972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.023 #22 NEW cov: 11687 ft: 12532 corp: 3/121b lim: 85 exec/s: 0 rss: 67Mb L: 52/68 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:16.023 [2024-04-27 06:50:45.874961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.023 [2024-04-27 06:50:45.874990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.875031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.023 [2024-04-27 06:50:45.875048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.875098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.023 [2024-04-27 06:50:45.875114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.875163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.023 [2024-04-27 06:50:45.875178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.023 #23 NEW cov: 11693 ft: 12719 corp: 4/189b lim: 85 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBinInt- 00:08:16.023 [2024-04-27 06:50:45.915097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.023 [2024-04-27 06:50:45.915124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.915175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.023 [2024-04-27 06:50:45.915192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.915243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.023 [2024-04-27 06:50:45.915258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.023 [2024-04-27 06:50:45.915313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.023 [2024-04-27 06:50:45.915327] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.283 #29 NEW cov: 11778 ft: 12898 corp: 5/258b lim: 85 exec/s: 0 rss: 67Mb L: 69/69 MS: 1 InsertByte- 00:08:16.283 [2024-04-27 06:50:45.955089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.283 [2024-04-27 06:50:45.955117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:45.955152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.283 [2024-04-27 06:50:45.955167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:45.955221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.283 [2024-04-27 06:50:45.955235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.283 #30 NEW cov: 11778 ft: 12970 corp: 6/313b lim: 85 exec/s: 0 rss: 67Mb L: 55/69 MS: 1 InsertRepeatedBytes- 00:08:16.283 [2024-04-27 06:50:45.995174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.283 [2024-04-27 06:50:45.995200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:45.995237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.283 [2024-04-27 06:50:45.995252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:45.995303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.283 [2024-04-27 06:50:45.995318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.283 #31 NEW cov: 11778 ft: 13057 corp: 7/375b lim: 85 exec/s: 0 rss: 67Mb L: 62/69 MS: 1 InsertRepeatedBytes- 00:08:16.283 [2024-04-27 06:50:46.035316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.283 [2024-04-27 06:50:46.035343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.035380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.283 [2024-04-27 06:50:46.035400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.035452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.283 [2024-04-27 06:50:46.035469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.283 #32 NEW cov: 11778 ft: 13118 corp: 8/430b lim: 85 exec/s: 0 rss: 67Mb L: 55/69 MS: 1 ChangeBit- 00:08:16.283 [2024-04-27 06:50:46.075567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.283 [2024-04-27 06:50:46.075594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.075641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.283 [2024-04-27 06:50:46.075656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.075705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.283 [2024-04-27 06:50:46.075722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.075770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.283 [2024-04-27 06:50:46.075785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.283 #33 NEW cov: 11778 ft: 13156 corp: 9/504b lim: 85 exec/s: 0 rss: 68Mb L: 74/74 MS: 1 CrossOver- 00:08:16.283 [2024-04-27 06:50:46.115774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.283 [2024-04-27 06:50:46.115802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.115836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.283 [2024-04-27 06:50:46.115851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.115904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.283 [2024-04-27 06:50:46.115919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.115970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.283 [2024-04-27 06:50:46.115985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.283 #39 NEW cov: 11778 ft: 13173 corp: 10/580b lim: 85 exec/s: 0 rss: 68Mb L: 76/76 MS: 1 CrossOver- 00:08:16.283 [2024-04-27 06:50:46.155811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.283 [2024-04-27 06:50:46.155839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.155894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.283 [2024-04-27 06:50:46.155911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.155963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.283 [2024-04-27 06:50:46.155979] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.283 [2024-04-27 06:50:46.156031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.283 [2024-04-27 06:50:46.156048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.283 #40 NEW cov: 11778 ft: 13257 corp: 11/654b lim: 85 exec/s: 0 rss: 68Mb L: 74/76 MS: 1 ChangeBit- 00:08:16.543 [2024-04-27 06:50:46.195742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.543 [2024-04-27 06:50:46.195773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.195826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.543 [2024-04-27 06:50:46.195841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.195896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.543 [2024-04-27 06:50:46.195910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.543 #41 NEW cov: 11778 ft: 13323 corp: 12/716b lim: 85 exec/s: 0 rss: 68Mb L: 62/76 MS: 1 EraseBytes- 00:08:16.543 [2024-04-27 06:50:46.235847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.543 [2024-04-27 06:50:46.235873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.235926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.543 [2024-04-27 06:50:46.235942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.235991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.543 [2024-04-27 06:50:46.236007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.543 #42 NEW cov: 11778 ft: 13390 corp: 13/778b lim: 85 exec/s: 0 rss: 68Mb L: 62/76 MS: 1 CopyPart- 00:08:16.543 [2024-04-27 06:50:46.276223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.543 [2024-04-27 06:50:46.276249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.276304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.543 [2024-04-27 06:50:46.276319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.276372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.543 [2024-04-27 06:50:46.276385] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.276441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.543 [2024-04-27 06:50:46.276457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.276510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:16.543 [2024-04-27 06:50:46.276525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.543 #43 NEW cov: 11778 ft: 13448 corp: 14/863b lim: 85 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:08:16.543 [2024-04-27 06:50:46.316072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.543 [2024-04-27 06:50:46.316099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.316151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.543 [2024-04-27 06:50:46.316166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.316220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.543 [2024-04-27 06:50:46.316237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.543 #44 NEW cov: 11778 ft: 13489 corp: 15/915b lim: 85 exec/s: 0 rss: 68Mb L: 52/85 MS: 1 ChangeBit- 00:08:16.543 [2024-04-27 06:50:46.356180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.543 [2024-04-27 06:50:46.356206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.356257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.543 [2024-04-27 06:50:46.356273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.543 [2024-04-27 06:50:46.356326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.543 [2024-04-27 06:50:46.356340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.543 #45 NEW cov: 11778 ft: 13581 corp: 16/967b lim: 85 exec/s: 0 rss: 68Mb L: 52/85 MS: 1 ChangeBinInt- 00:08:16.543 [2024-04-27 06:50:46.396419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.543 [2024-04-27 06:50:46.396446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.544 [2024-04-27 06:50:46.396507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.544 [2024-04-27 06:50:46.396523] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.544 [2024-04-27 06:50:46.396574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.544 [2024-04-27 06:50:46.396591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.544 [2024-04-27 06:50:46.396645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.544 [2024-04-27 06:50:46.396661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.544 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.544 #46 NEW cov: 11801 ft: 13651 corp: 17/1042b lim: 85 exec/s: 0 rss: 68Mb L: 75/85 MS: 1 InsertByte- 00:08:16.544 [2024-04-27 06:50:46.436431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.544 [2024-04-27 06:50:46.436458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.544 [2024-04-27 06:50:46.436524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.544 [2024-04-27 06:50:46.436540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.544 [2024-04-27 06:50:46.436595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.544 [2024-04-27 06:50:46.436611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.803 #47 NEW cov: 11801 ft: 13719 corp: 18/1096b lim: 85 exec/s: 0 rss: 68Mb L: 54/85 MS: 1 CopyPart- 00:08:16.803 [2024-04-27 06:50:46.476798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.803 [2024-04-27 06:50:46.476825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.476874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.803 [2024-04-27 06:50:46.476893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.476943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.803 [2024-04-27 06:50:46.476958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.477008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.803 [2024-04-27 06:50:46.477023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.477073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:16.803 [2024-04-27 
06:50:46.477089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.803 #48 NEW cov: 11801 ft: 13735 corp: 19/1181b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 1 ChangeBinInt- 00:08:16.803 [2024-04-27 06:50:46.516829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.803 [2024-04-27 06:50:46.516856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.516915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.803 [2024-04-27 06:50:46.516931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.516982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.803 [2024-04-27 06:50:46.516996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.517048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.803 [2024-04-27 06:50:46.517063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.803 #49 NEW cov: 11801 ft: 13750 corp: 20/1250b lim: 85 exec/s: 49 rss: 69Mb L: 69/85 MS: 1 ChangeByte- 00:08:16.803 [2024-04-27 06:50:46.556915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.803 [2024-04-27 06:50:46.556942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.803 [2024-04-27 06:50:46.556980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.804 [2024-04-27 06:50:46.556997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.557050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.804 [2024-04-27 06:50:46.557066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.557119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.804 [2024-04-27 06:50:46.557135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.804 #50 NEW cov: 11801 ft: 13767 corp: 21/1321b lim: 85 exec/s: 50 rss: 69Mb L: 71/85 MS: 1 CopyPart- 00:08:16.804 [2024-04-27 06:50:46.596832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.804 [2024-04-27 06:50:46.596859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.596910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.804 
[2024-04-27 06:50:46.596926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.596977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.804 [2024-04-27 06:50:46.596992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.804 #53 NEW cov: 11801 ft: 13792 corp: 22/1376b lim: 85 exec/s: 53 rss: 69Mb L: 55/85 MS: 3 CrossOver-ShuffleBytes-CrossOver- 00:08:16.804 [2024-04-27 06:50:46.637147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.804 [2024-04-27 06:50:46.637175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.637212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.804 [2024-04-27 06:50:46.637227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.637278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.804 [2024-04-27 06:50:46.637293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.637344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.804 [2024-04-27 06:50:46.637358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.804 #54 NEW cov: 11801 ft: 13798 corp: 23/1452b lim: 85 exec/s: 54 rss: 69Mb L: 76/85 MS: 1 InsertRepeatedBytes- 00:08:16.804 [2024-04-27 06:50:46.677269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:16.804 [2024-04-27 06:50:46.677297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.677343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:16.804 [2024-04-27 06:50:46.677359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.677429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:16.804 [2024-04-27 06:50:46.677447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.804 [2024-04-27 06:50:46.677501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:16.804 [2024-04-27 06:50:46.677516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.804 #55 NEW cov: 11801 ft: 13808 corp: 24/1520b lim: 85 exec/s: 55 rss: 69Mb L: 68/85 MS: 1 ChangeBit- 00:08:17.063 [2024-04-27 06:50:46.717376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:0 nsid:0 00:08:17.063 [2024-04-27 06:50:46.717408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.063 [2024-04-27 06:50:46.717462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.063 [2024-04-27 06:50:46.717479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.063 [2024-04-27 06:50:46.717542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.063 [2024-04-27 06:50:46.717557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.063 [2024-04-27 06:50:46.717610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:17.063 [2024-04-27 06:50:46.717626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.063 #56 NEW cov: 11801 ft: 13812 corp: 25/1588b lim: 85 exec/s: 56 rss: 69Mb L: 68/85 MS: 1 CopyPart- 00:08:17.063 [2024-04-27 06:50:46.757320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.063 [2024-04-27 06:50:46.757348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.063 [2024-04-27 06:50:46.757383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.063 [2024-04-27 06:50:46.757401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.757453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.064 [2024-04-27 06:50:46.757469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.064 #57 NEW cov: 11801 ft: 13836 corp: 26/1646b lim: 85 exec/s: 57 rss: 69Mb L: 58/85 MS: 1 CrossOver- 00:08:17.064 [2024-04-27 06:50:46.797754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.064 [2024-04-27 06:50:46.797782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.797830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.064 [2024-04-27 06:50:46.797845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.797895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.064 [2024-04-27 06:50:46.797910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.797962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:17.064 [2024-04-27 06:50:46.797976] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.798027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:17.064 [2024-04-27 06:50:46.798042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.064 #58 NEW cov: 11801 ft: 13864 corp: 27/1731b lim: 85 exec/s: 58 rss: 69Mb L: 85/85 MS: 1 CopyPart- 00:08:17.064 [2024-04-27 06:50:46.837572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.064 [2024-04-27 06:50:46.837599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.837635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.064 [2024-04-27 06:50:46.837649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.837702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.064 [2024-04-27 06:50:46.837717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.064 #59 NEW cov: 11801 ft: 13883 corp: 28/1792b lim: 85 exec/s: 59 rss: 69Mb L: 61/85 MS: 1 CrossOver- 00:08:17.064 [2024-04-27 06:50:46.877705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.064 [2024-04-27 06:50:46.877734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.877775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.064 [2024-04-27 06:50:46.877790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.877842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.064 [2024-04-27 06:50:46.877857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.064 #60 NEW cov: 11801 ft: 13899 corp: 29/1855b lim: 85 exec/s: 60 rss: 69Mb L: 63/85 MS: 1 InsertByte- 00:08:17.064 [2024-04-27 06:50:46.917796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.064 [2024-04-27 06:50:46.917822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.917871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.064 [2024-04-27 06:50:46.917887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.917937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.064 [2024-04-27 06:50:46.917954] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.064 #62 NEW cov: 11801 ft: 13966 corp: 30/1907b lim: 85 exec/s: 62 rss: 69Mb L: 52/85 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:17.064 [2024-04-27 06:50:46.957918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.064 [2024-04-27 06:50:46.957944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.957996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.064 [2024-04-27 06:50:46.958012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.064 [2024-04-27 06:50:46.958064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.064 [2024-04-27 06:50:46.958078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.323 #63 NEW cov: 11801 ft: 14087 corp: 31/1959b lim: 85 exec/s: 63 rss: 69Mb L: 52/85 MS: 1 ChangeByte- 00:08:17.323 [2024-04-27 06:50:46.998186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.323 [2024-04-27 06:50:46.998214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.323 [2024-04-27 06:50:46.998256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.323 [2024-04-27 06:50:46.998271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.323 [2024-04-27 06:50:46.998322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.323 [2024-04-27 06:50:46.998336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.323 [2024-04-27 06:50:46.998389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:17.323 [2024-04-27 06:50:46.998408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.323 #64 NEW cov: 11801 ft: 14109 corp: 32/2043b lim: 85 exec/s: 64 rss: 69Mb L: 84/85 MS: 1 InsertRepeatedBytes- 00:08:17.323 [2024-04-27 06:50:47.038209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.323 [2024-04-27 06:50:47.038235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.323 [2024-04-27 06:50:47.038267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.323 [2024-04-27 06:50:47.038282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.038335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 
00:08:17.324 [2024-04-27 06:50:47.038350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.324 #65 NEW cov: 11801 ft: 14114 corp: 33/2105b lim: 85 exec/s: 65 rss: 69Mb L: 62/85 MS: 1 ChangeByte- 00:08:17.324 [2024-04-27 06:50:47.078307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.324 [2024-04-27 06:50:47.078335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.078368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.324 [2024-04-27 06:50:47.078382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.078442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.324 [2024-04-27 06:50:47.078456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.324 #66 NEW cov: 11801 ft: 14133 corp: 34/2168b lim: 85 exec/s: 66 rss: 69Mb L: 63/85 MS: 1 ChangeByte- 00:08:17.324 [2024-04-27 06:50:47.118422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.324 [2024-04-27 06:50:47.118451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.118507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.324 [2024-04-27 06:50:47.118523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.118577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.324 [2024-04-27 06:50:47.118593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.324 #67 NEW cov: 11801 ft: 14142 corp: 35/2222b lim: 85 exec/s: 67 rss: 69Mb L: 54/85 MS: 1 CrossOver- 00:08:17.324 [2024-04-27 06:50:47.158518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.324 [2024-04-27 06:50:47.158545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.158585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.324 [2024-04-27 06:50:47.158600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.158652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.324 [2024-04-27 06:50:47.158668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.324 #68 NEW cov: 11801 ft: 14154 corp: 36/2284b lim: 85 exec/s: 68 rss: 70Mb L: 62/85 MS: 1 ChangeBinInt- 00:08:17.324 [2024-04-27 
06:50:47.198612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.324 [2024-04-27 06:50:47.198639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.198675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.324 [2024-04-27 06:50:47.198690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.324 [2024-04-27 06:50:47.198743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.324 [2024-04-27 06:50:47.198758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.583 #69 NEW cov: 11801 ft: 14157 corp: 37/2338b lim: 85 exec/s: 69 rss: 70Mb L: 54/85 MS: 1 ChangeBit- 00:08:17.583 [2024-04-27 06:50:47.238904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.583 [2024-04-27 06:50:47.238932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.238972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.583 [2024-04-27 06:50:47.238988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.239040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.583 [2024-04-27 06:50:47.239056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.239108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:17.583 [2024-04-27 06:50:47.239125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.583 #70 NEW cov: 11801 ft: 14167 corp: 38/2422b lim: 85 exec/s: 70 rss: 70Mb L: 84/85 MS: 1 ChangeBinInt- 00:08:17.583 [2024-04-27 06:50:47.278829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.583 [2024-04-27 06:50:47.278856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.278893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.583 [2024-04-27 06:50:47.278909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.278961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.583 [2024-04-27 06:50:47.278976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.583 #71 NEW cov: 11801 ft: 14194 corp: 39/2484b lim: 85 exec/s: 71 rss: 70Mb L: 62/85 MS: 1 ChangeBit- 00:08:17.583 
[2024-04-27 06:50:47.318921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.583 [2024-04-27 06:50:47.318948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.318983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.583 [2024-04-27 06:50:47.318998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.319053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.583 [2024-04-27 06:50:47.319068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.583 #72 NEW cov: 11801 ft: 14215 corp: 40/2547b lim: 85 exec/s: 72 rss: 70Mb L: 63/85 MS: 1 ChangeByte- 00:08:17.583 [2024-04-27 06:50:47.358921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.583 [2024-04-27 06:50:47.358948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.359002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.583 [2024-04-27 06:50:47.359016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.583 #75 NEW cov: 11801 ft: 14534 corp: 41/2592b lim: 85 exec/s: 75 rss: 70Mb L: 45/85 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:17.583 [2024-04-27 06:50:47.399329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.583 [2024-04-27 06:50:47.399358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.399399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.583 [2024-04-27 06:50:47.399415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.399469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.583 [2024-04-27 06:50:47.399483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.399536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:17.583 [2024-04-27 06:50:47.399552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.583 #76 NEW cov: 11801 ft: 14541 corp: 42/2660b lim: 85 exec/s: 76 rss: 70Mb L: 68/85 MS: 1 ChangeBit- 00:08:17.583 [2024-04-27 06:50:47.439258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.583 [2024-04-27 06:50:47.439286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.439322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.583 [2024-04-27 06:50:47.439337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.583 [2024-04-27 06:50:47.439390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.583 [2024-04-27 06:50:47.439411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.583 #77 NEW cov: 11801 ft: 14547 corp: 43/2714b lim: 85 exec/s: 77 rss: 70Mb L: 54/85 MS: 1 ChangeBit- 00:08:17.842 [2024-04-27 06:50:47.479415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.842 [2024-04-27 06:50:47.479443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.842 [2024-04-27 06:50:47.479481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.842 [2024-04-27 06:50:47.479497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.842 [2024-04-27 06:50:47.479550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.842 [2024-04-27 06:50:47.479566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.842 #78 NEW cov: 11801 ft: 14550 corp: 44/2768b lim: 85 exec/s: 78 rss: 70Mb L: 54/85 MS: 1 CopyPart- 00:08:17.842 [2024-04-27 06:50:47.519509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:17.842 [2024-04-27 06:50:47.519538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.842 [2024-04-27 06:50:47.519573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:17.842 [2024-04-27 06:50:47.519590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.842 [2024-04-27 06:50:47.519660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:17.842 [2024-04-27 06:50:47.519677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.842 #79 NEW cov: 11801 ft: 14555 corp: 45/2822b lim: 85 exec/s: 39 rss: 70Mb L: 54/85 MS: 1 ChangeBinInt- 00:08:17.842 #79 DONE cov: 11801 ft: 14555 corp: 45/2822b lim: 85 exec/s: 39 rss: 70Mb 00:08:17.842 Done 79 runs in 2 second(s) 00:08:17.842 06:50:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:17.842 06:50:47 -- ../common.sh@72 -- # (( i++ )) 00:08:17.842 06:50:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.842 06:50:47 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:17.842 06:50:47 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:17.842 06:50:47 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.842 06:50:47 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.842 06:50:47 -- 
nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:17.842 06:50:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:17.842 06:50:47 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:17.842 06:50:47 -- nvmf/run.sh@29 -- # port=4423 00:08:17.842 06:50:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:17.842 06:50:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:17.842 06:50:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.842 06:50:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:17.843 [2024-04-27 06:50:47.693565] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:17.843 [2024-04-27 06:50:47.693661] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2628753 ] 00:08:17.843 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.101 [2024-04-27 06:50:47.869060] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.101 [2024-04-27 06:50:47.888603] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:18.101 [2024-04-27 06:50:47.888724] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.101 [2024-04-27 06:50:47.940128] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.101 [2024-04-27 06:50:47.956458] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:18.101 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.101 INFO: Seed: 3604865032 00:08:18.101 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:18.101 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:18.101 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:18.101 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.101 #2 INITED exec/s: 0 rss: 59Mb 00:08:18.101 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:18.101 This may also happen if the target rejected all inputs we tried so far 00:08:18.360 [2024-04-27 06:50:48.001113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.360 [2024-04-27 06:50:48.001150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.360 [2024-04-27 06:50:48.001186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.360 [2024-04-27 06:50:48.001203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.360 [2024-04-27 06:50:48.001232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.360 [2024-04-27 06:50:48.001249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.619 NEW_FUNC[1/659]: 0x4c7df0 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:18.619 NEW_FUNC[2/659]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.619 #4 NEW cov: 11481 ft: 11484 corp: 2/20b lim: 25 exec/s: 0 rss: 66Mb L: 19/19 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:18.619 [2024-04-27 06:50:48.321881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.619 [2024-04-27 06:50:48.321919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.321954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.619 [2024-04-27 06:50:48.321972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.322002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.619 [2024-04-27 06:50:48.322018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.619 NEW_FUNC[1/5]: 0xf1ee00 in rte_get_tsc_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/rte_cycles.h:61 00:08:18.619 NEW_FUNC[2/5]: 0xf1ee60 in rte_rdtsc /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/rte_cycles.h:31 00:08:18.619 #5 NEW cov: 11620 ft: 11907 corp: 3/39b lim: 25 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ChangeBit- 00:08:18.619 [2024-04-27 06:50:48.391970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.619 [2024-04-27 06:50:48.392000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.392046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.619 [2024-04-27 06:50:48.392064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.392095] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.619 [2024-04-27 06:50:48.392111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.392139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:18.619 [2024-04-27 06:50:48.392155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.619 #6 NEW cov: 11626 ft: 12650 corp: 4/63b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:18.619 [2024-04-27 06:50:48.442024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.619 [2024-04-27 06:50:48.442056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.442104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.619 [2024-04-27 06:50:48.442122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.442152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.619 [2024-04-27 06:50:48.442168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.619 #12 NEW cov: 11711 ft: 13015 corp: 5/82b lim: 25 exec/s: 0 rss: 67Mb L: 19/24 MS: 1 ChangeByte- 00:08:18.619 [2024-04-27 06:50:48.502202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.619 [2024-04-27 06:50:48.502231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.502277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.619 [2024-04-27 06:50:48.502295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.619 [2024-04-27 06:50:48.502325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.619 [2024-04-27 06:50:48.502342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.879 #13 NEW cov: 11711 ft: 13141 corp: 6/101b lim: 25 exec/s: 0 rss: 67Mb L: 19/24 MS: 1 CopyPart- 00:08:18.879 [2024-04-27 06:50:48.562353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.879 [2024-04-27 06:50:48.562382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.562436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.879 [2024-04-27 06:50:48.562454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.562486] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.879 [2024-04-27 06:50:48.562502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.879 #14 NEW cov: 11711 ft: 13266 corp: 7/120b lim: 25 exec/s: 0 rss: 67Mb L: 19/24 MS: 1 CopyPart- 00:08:18.879 [2024-04-27 06:50:48.612463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.879 [2024-04-27 06:50:48.612492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.612540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.879 [2024-04-27 06:50:48.612557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.612588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.879 [2024-04-27 06:50:48.612604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.879 #15 NEW cov: 11711 ft: 13304 corp: 8/139b lim: 25 exec/s: 0 rss: 67Mb L: 19/24 MS: 1 CopyPart- 00:08:18.879 [2024-04-27 06:50:48.662679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.879 [2024-04-27 06:50:48.662709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.662764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.879 [2024-04-27 06:50:48.662783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.662813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.879 [2024-04-27 06:50:48.662830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.662858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:18.879 [2024-04-27 06:50:48.662874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.662903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:18.879 [2024-04-27 06:50:48.662918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:18.879 #16 NEW cov: 11711 ft: 13470 corp: 9/164b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CopyPart- 00:08:18.879 [2024-04-27 06:50:48.722852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:18.879 [2024-04-27 06:50:48.722882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.722929] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:18.879 [2024-04-27 06:50:48.722946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.722977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:18.879 [2024-04-27 06:50:48.722993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.879 [2024-04-27 06:50:48.723021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:18.879 [2024-04-27 06:50:48.723037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.879 #17 NEW cov: 11711 ft: 13566 corp: 10/187b lim: 25 exec/s: 0 rss: 69Mb L: 23/25 MS: 1 CMP- DE: "8\000\000\000"- 00:08:19.138 [2024-04-27 06:50:48.782984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.138 [2024-04-27 06:50:48.783014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.138 [2024-04-27 06:50:48.783046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.138 [2024-04-27 06:50:48.783079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.138 [2024-04-27 06:50:48.783110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.138 [2024-04-27 06:50:48.783125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.138 [2024-04-27 06:50:48.783155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.138 [2024-04-27 06:50:48.783172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.138 #18 NEW cov: 11711 ft: 13631 corp: 11/210b lim: 25 exec/s: 0 rss: 69Mb L: 23/25 MS: 1 PersAutoDict- DE: "8\000\000\000"- 00:08:19.138 [2024-04-27 06:50:48.843128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.138 [2024-04-27 06:50:48.843158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.138 [2024-04-27 06:50:48.843210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.138 [2024-04-27 06:50:48.843227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.138 [2024-04-27 06:50:48.843257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.138 [2024-04-27 06:50:48.843273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.138 [2024-04-27 06:50:48.843301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 
00:08:19.138 [2024-04-27 06:50:48.843317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.138 #19 NEW cov: 11711 ft: 13648 corp: 12/230b lim: 25 exec/s: 0 rss: 69Mb L: 20/25 MS: 1 InsertByte- 00:08:19.139 [2024-04-27 06:50:48.893226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.139 [2024-04-27 06:50:48.893256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:48.893304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.139 [2024-04-27 06:50:48.893321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:48.893350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.139 [2024-04-27 06:50:48.893366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:48.893402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.139 [2024-04-27 06:50:48.893418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.139 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.139 #20 NEW cov: 11728 ft: 13707 corp: 13/250b lim: 25 exec/s: 0 rss: 69Mb L: 20/25 MS: 1 InsertByte- 00:08:19.139 [2024-04-27 06:50:48.953423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.139 [2024-04-27 06:50:48.953454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:48.953502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.139 [2024-04-27 06:50:48.953520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:48.953550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.139 [2024-04-27 06:50:48.953566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:48.953604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.139 [2024-04-27 06:50:48.953620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.139 #21 NEW cov: 11728 ft: 13725 corp: 14/273b lim: 25 exec/s: 21 rss: 69Mb L: 23/25 MS: 1 ShuffleBytes- 00:08:19.139 [2024-04-27 06:50:49.013606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.139 [2024-04-27 06:50:49.013636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 
06:50:49.013669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.139 [2024-04-27 06:50:49.013691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:49.013722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.139 [2024-04-27 06:50:49.013738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.139 [2024-04-27 06:50:49.013766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.139 [2024-04-27 06:50:49.013782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.398 #22 NEW cov: 11728 ft: 13732 corp: 15/296b lim: 25 exec/s: 22 rss: 69Mb L: 23/25 MS: 1 CrossOver- 00:08:19.398 [2024-04-27 06:50:49.063746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.398 [2024-04-27 06:50:49.063777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.063826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.398 [2024-04-27 06:50:49.063844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.063874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.398 [2024-04-27 06:50:49.063891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.063920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.398 [2024-04-27 06:50:49.063936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.398 #23 NEW cov: 11728 ft: 13769 corp: 16/320b lim: 25 exec/s: 23 rss: 69Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:19.398 [2024-04-27 06:50:49.123821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.398 [2024-04-27 06:50:49.123850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.123898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.398 [2024-04-27 06:50:49.123915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.123945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.398 [2024-04-27 06:50:49.123961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.398 #24 NEW cov: 11728 ft: 13806 corp: 17/339b lim: 25 exec/s: 24 rss: 69Mb L: 19/25 MS: 1 PersAutoDict- DE: "8\000\000\000"- 00:08:19.398 
[2024-04-27 06:50:49.174035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.398 [2024-04-27 06:50:49.174065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.174112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.398 [2024-04-27 06:50:49.174129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.174160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.398 [2024-04-27 06:50:49.174176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.174203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.398 [2024-04-27 06:50:49.174224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.174252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:19.398 [2024-04-27 06:50:49.174268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.398 #25 NEW cov: 11728 ft: 13829 corp: 18/364b lim: 25 exec/s: 25 rss: 69Mb L: 25/25 MS: 1 CrossOver- 00:08:19.398 [2024-04-27 06:50:49.224049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.398 [2024-04-27 06:50:49.224077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.224125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.398 [2024-04-27 06:50:49.224142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.224172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.398 [2024-04-27 06:50:49.224188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.398 #26 NEW cov: 11728 ft: 13847 corp: 19/380b lim: 25 exec/s: 26 rss: 69Mb L: 16/25 MS: 1 EraseBytes- 00:08:19.398 [2024-04-27 06:50:49.274182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.398 [2024-04-27 06:50:49.274210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.274257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.398 [2024-04-27 06:50:49.274274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.398 [2024-04-27 06:50:49.274304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:08:19.398 [2024-04-27 06:50:49.274321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.658 #27 NEW cov: 11728 ft: 13867 corp: 20/396b lim: 25 exec/s: 27 rss: 69Mb L: 16/25 MS: 1 ChangeASCIIInt- 00:08:19.658 [2024-04-27 06:50:49.334419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.658 [2024-04-27 06:50:49.334447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.334494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.658 [2024-04-27 06:50:49.334511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.334541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.658 [2024-04-27 06:50:49.334556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.334585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.658 [2024-04-27 06:50:49.334601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.658 #28 NEW cov: 11728 ft: 13876 corp: 21/416b lim: 25 exec/s: 28 rss: 69Mb L: 20/25 MS: 1 CMP- DE: "\015\000\000\000"- 00:08:19.658 [2024-04-27 06:50:49.394531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.658 [2024-04-27 06:50:49.394559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.394611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.658 [2024-04-27 06:50:49.394628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.394658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.658 [2024-04-27 06:50:49.394673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.658 #29 NEW cov: 11728 ft: 13905 corp: 22/432b lim: 25 exec/s: 29 rss: 70Mb L: 16/25 MS: 1 CMP- DE: "\263)\012\002\000\000\000\000"- 00:08:19.658 [2024-04-27 06:50:49.454757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.658 [2024-04-27 06:50:49.454786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.454818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.658 [2024-04-27 06:50:49.454835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.454864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.658 [2024-04-27 06:50:49.454880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.454908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.658 [2024-04-27 06:50:49.454924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.658 #30 NEW cov: 11728 ft: 14009 corp: 23/456b lim: 25 exec/s: 30 rss: 70Mb L: 24/25 MS: 1 CMP- DE: "\377\377\377\354"- 00:08:19.658 [2024-04-27 06:50:49.514867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.658 [2024-04-27 06:50:49.514897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.514929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.658 [2024-04-27 06:50:49.514946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.658 [2024-04-27 06:50:49.514976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.658 [2024-04-27 06:50:49.514992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.658 #31 NEW cov: 11728 ft: 14037 corp: 24/475b lim: 25 exec/s: 31 rss: 70Mb L: 19/25 MS: 1 ChangeBinInt- 00:08:19.918 [2024-04-27 06:50:49.565030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.918 [2024-04-27 06:50:49.565059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.565105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.918 [2024-04-27 06:50:49.565122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.565152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.918 [2024-04-27 06:50:49.565168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.565196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.918 [2024-04-27 06:50:49.565216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.565244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:19.918 [2024-04-27 06:50:49.565260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.918 #32 NEW cov: 11728 ft: 14045 corp: 25/500b lim: 25 exec/s: 32 rss: 70Mb L: 25/25 MS: 1 InsertByte- 00:08:19.918 [2024-04-27 06:50:49.625174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.918 [2024-04-27 06:50:49.625202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.625249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.918 [2024-04-27 06:50:49.625266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.625296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.918 [2024-04-27 06:50:49.625312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.625340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.918 [2024-04-27 06:50:49.625356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.918 #33 NEW cov: 11728 ft: 14063 corp: 26/524b lim: 25 exec/s: 33 rss: 70Mb L: 24/25 MS: 1 ChangeASCIIInt- 00:08:19.918 [2024-04-27 06:50:49.685284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.918 [2024-04-27 06:50:49.685312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.685360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.918 [2024-04-27 06:50:49.685377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.918 [2024-04-27 06:50:49.685415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.919 [2024-04-27 06:50:49.685432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.919 #34 NEW cov: 11728 ft: 14081 corp: 27/539b lim: 25 exec/s: 34 rss: 70Mb L: 15/25 MS: 1 EraseBytes- 00:08:19.919 [2024-04-27 06:50:49.745523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.919 [2024-04-27 06:50:49.745554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.919 [2024-04-27 06:50:49.745603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.919 [2024-04-27 06:50:49.745621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.919 [2024-04-27 06:50:49.745651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.919 [2024-04-27 06:50:49.745668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.919 [2024-04-27 06:50:49.745697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.919 [2024-04-27 06:50:49.745713] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.919 #35 NEW cov: 11728 ft: 14088 corp: 28/559b lim: 25 exec/s: 35 rss: 70Mb L: 20/25 MS: 1 ShuffleBytes- 00:08:19.919 [2024-04-27 06:50:49.795596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:19.919 [2024-04-27 06:50:49.795625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.919 [2024-04-27 06:50:49.795672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:19.919 [2024-04-27 06:50:49.795690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.919 [2024-04-27 06:50:49.795719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:19.919 [2024-04-27 06:50:49.795735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.919 [2024-04-27 06:50:49.795763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:19.919 [2024-04-27 06:50:49.795779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.179 #36 NEW cov: 11728 ft: 14121 corp: 29/579b lim: 25 exec/s: 36 rss: 70Mb L: 20/25 MS: 1 ChangeBit- 00:08:20.179 [2024-04-27 06:50:49.845675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:20.179 [2024-04-27 06:50:49.845703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.845751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:20.179 [2024-04-27 06:50:49.845769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.845798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:20.179 [2024-04-27 06:50:49.845814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.179 #37 NEW cov: 11728 ft: 14141 corp: 30/598b lim: 25 exec/s: 37 rss: 70Mb L: 19/25 MS: 1 ChangeBinInt- 00:08:20.179 [2024-04-27 06:50:49.895855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:20.179 [2024-04-27 06:50:49.895883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.895930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:20.179 [2024-04-27 06:50:49.895947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.895976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:20.179 [2024-04-27 06:50:49.895992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.896020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:20.179 [2024-04-27 06:50:49.896036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.179 #38 NEW cov: 11735 ft: 14161 corp: 31/618b lim: 25 exec/s: 38 rss: 70Mb L: 20/25 MS: 1 ShuffleBytes- 00:08:20.179 [2024-04-27 06:50:49.945935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:20.179 [2024-04-27 06:50:49.945964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.946011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:20.179 [2024-04-27 06:50:49.946028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.946063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:20.179 [2024-04-27 06:50:49.946079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.179 #39 NEW cov: 11735 ft: 14180 corp: 32/634b lim: 25 exec/s: 39 rss: 70Mb L: 16/25 MS: 1 CopyPart- 00:08:20.179 [2024-04-27 06:50:49.996109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:20.179 [2024-04-27 06:50:49.996137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.996184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:20.179 [2024-04-27 06:50:49.996201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.996231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:20.179 [2024-04-27 06:50:49.996247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.179 [2024-04-27 06:50:49.996275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:20.179 [2024-04-27 06:50:49.996290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.179 #40 NEW cov: 11735 ft: 14210 corp: 33/654b lim: 25 exec/s: 20 rss: 70Mb L: 20/25 MS: 1 ChangeByte- 00:08:20.179 #40 DONE cov: 11735 ft: 14210 corp: 33/654b lim: 25 exec/s: 20 rss: 70Mb 00:08:20.179 ###### Recommended dictionary. ###### 00:08:20.179 "8\000\000\000" # Uses: 2 00:08:20.179 "\015\000\000\000" # Uses: 0 00:08:20.179 "\263)\012\002\000\000\000\000" # Uses: 0 00:08:20.179 "\377\377\377\354" # Uses: 0 00:08:20.179 ###### End of recommended dictionary. 
###### 00:08:20.179 Done 40 runs in 2 second(s) 00:08:20.439 06:50:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:20.439 06:50:50 -- ../common.sh@72 -- # (( i++ )) 00:08:20.439 06:50:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.439 06:50:50 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:20.439 06:50:50 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:20.439 06:50:50 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.439 06:50:50 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.439 06:50:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:20.439 06:50:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:20.439 06:50:50 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:20.439 06:50:50 -- nvmf/run.sh@29 -- # port=4424 00:08:20.439 06:50:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:20.439 06:50:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:20.439 06:50:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.439 06:50:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:20.439 [2024-04-27 06:50:50.202373] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:20.439 [2024-04-27 06:50:50.202451] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2629049 ] 00:08:20.439 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.698 [2024-04-27 06:50:50.389373] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.698 [2024-04-27 06:50:50.410044] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.698 [2024-04-27 06:50:50.410187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.698 [2024-04-27 06:50:50.461798] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.698 [2024-04-27 06:50:50.478119] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:20.698 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.698 INFO: Seed: 1830899833 00:08:20.698 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:20.698 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:20.698 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:20.698 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.698 #2 INITED exec/s: 0 rss: 59Mb 00:08:20.698 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:20.698 This may also happen if the target rejected all inputs we tried so far 00:08:20.698 [2024-04-27 06:50:50.554089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.698 [2024-04-27 06:50:50.554126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.698 [2024-04-27 06:50:50.554249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.698 [2024-04-27 06:50:50.554269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.267 NEW_FUNC[1/664]: 0x4c8ed0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:21.267 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.267 #12 NEW cov: 11572 ft: 11573 corp: 2/48b lim: 100 exec/s: 0 rss: 67Mb L: 47/47 MS: 5 ChangeBinInt-CrossOver-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:21.267 [2024-04-27 06:50:50.895543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:50.895598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.267 [2024-04-27 06:50:50.895753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:50.895788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.267 [2024-04-27 06:50:50.895928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:50.895957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.267 NEW_FUNC[1/1]: 0x17796b0 in nvme_tcp_qpair /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:171 00:08:21.267 #15 NEW cov: 11692 ft: 12444 corp: 3/114b lim: 100 exec/s: 0 rss: 67Mb L: 66/66 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:21.267 [2024-04-27 06:50:50.945279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:50.945305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.267 [2024-04-27 06:50:50.945442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:50.945463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.267 #16 NEW cov: 11698 ft: 12692 corp: 4/161b lim: 100 exec/s: 0 
rss: 67Mb L: 47/66 MS: 1 ChangeByte- 00:08:21.267 [2024-04-27 06:50:51.005579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:51.005606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.267 [2024-04-27 06:50:51.005757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251626288887735 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:51.005783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.267 #17 NEW cov: 11783 ft: 13020 corp: 5/208b lim: 100 exec/s: 0 rss: 67Mb L: 47/66 MS: 1 ChangeBinInt- 00:08:21.267 [2024-04-27 06:50:51.055878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:51.055906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.267 [2024-04-27 06:50:51.056007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:51.056028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.267 [2024-04-27 06:50:51.056157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.267 [2024-04-27 06:50:51.056182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.267 #18 NEW cov: 11783 ft: 13226 corp: 6/274b lim: 100 exec/s: 0 rss: 67Mb L: 66/66 MS: 1 ChangeBinInt- 00:08:21.268 [2024-04-27 06:50:51.116098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.268 [2024-04-27 06:50:51.116125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.268 [2024-04-27 06:50:51.116238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4195730026713823159 len:14907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.268 [2024-04-27 06:50:51.116266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.268 [2024-04-27 06:50:51.116400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.268 [2024-04-27 06:50:51.116434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.268 #19 NEW cov: 11783 ft: 13324 corp: 7/345b lim: 100 exec/s: 0 rss: 67Mb L: 71/71 MS: 1 InsertRepeatedBytes- 00:08:21.526 [2024-04-27 06:50:51.166214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.166250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.166379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.166406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.166544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.166571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.526 #20 NEW cov: 11783 ft: 13361 corp: 8/417b lim: 100 exec/s: 0 rss: 67Mb L: 72/72 MS: 1 CopyPart- 00:08:21.526 [2024-04-27 06:50:51.216618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:100663296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.216656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.216764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.216788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.216915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:14907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.216938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.217068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.217092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.526 #26 NEW cov: 11783 ft: 13798 corp: 9/500b lim: 100 exec/s: 0 rss: 68Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:21.526 [2024-04-27 06:50:51.276316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:46849 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.276351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.276497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13186539712023082935 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.276525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 #32 NEW cov: 11783 ft: 13823 corp: 10/558b lim: 100 exec/s: 0 rss: 68Mb L: 58/83 MS: 1 CopyPart- 00:08:21.526 
[2024-04-27 06:50:51.327068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.327102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.327198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-04-27 06:50:51.327221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-04-27 06:50:51.327349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13186539712023082935 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.527 [2024-04-27 06:50:51.327372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.527 [2024-04-27 06:50:51.327503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.527 [2024-04-27 06:50:51.327530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.527 #33 NEW cov: 11783 ft: 13886 corp: 11/650b lim: 100 exec/s: 0 rss: 68Mb L: 92/92 MS: 1 CrossOver- 00:08:21.527 [2024-04-27 06:50:51.387259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:100663296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.527 [2024-04-27 06:50:51.387290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.527 [2024-04-27 06:50:51.387419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251626285807543 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.527 [2024-04-27 06:50:51.387456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.527 [2024-04-27 06:50:51.387585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.527 [2024-04-27 06:50:51.387605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.527 [2024-04-27 06:50:51.387734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4195730024608447034 len:14907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.527 [2024-04-27 06:50:51.387758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.527 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.527 #34 NEW cov: 11806 ft: 14078 corp: 12/740b lim: 100 exec/s: 0 rss: 68Mb L: 90/92 MS: 1 CrossOver- 00:08:21.785 [2024-04-27 06:50:51.446830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 
06:50:51.446863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.785 [2024-04-27 06:50:51.446985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.447010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.785 #35 NEW cov: 11806 ft: 14091 corp: 13/788b lim: 100 exec/s: 0 rss: 68Mb L: 48/92 MS: 1 InsertByte- 00:08:21.785 [2024-04-27 06:50:51.497271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.497305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.785 [2024-04-27 06:50:51.497437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.497459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.785 [2024-04-27 06:50:51.497588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13238251629368031159 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.497613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.785 #36 NEW cov: 11806 ft: 14102 corp: 14/856b lim: 100 exec/s: 0 rss: 68Mb L: 68/92 MS: 1 CopyPart- 00:08:21.785 [2024-04-27 06:50:51.546917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.546947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.785 #37 NEW cov: 11806 ft: 14931 corp: 15/894b lim: 100 exec/s: 37 rss: 68Mb L: 38/92 MS: 1 CrossOver- 00:08:21.785 [2024-04-27 06:50:51.597367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.597413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.785 [2024-04-27 06:50:51.597569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.597592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.785 #38 NEW cov: 11806 ft: 14980 corp: 16/948b lim: 100 exec/s: 38 rss: 68Mb L: 54/92 MS: 1 EraseBytes- 00:08:21.785 [2024-04-27 06:50:51.647485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:46849 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.647519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.785 [2024-04-27 06:50:51.647645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238049632761132983 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.785 [2024-04-27 06:50:51.647668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.785 #39 NEW cov: 11806 ft: 15006 corp: 17/1007b lim: 100 exec/s: 39 rss: 68Mb L: 59/92 MS: 1 InsertByte- 00:08:22.045 [2024-04-27 06:50:51.697735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.697770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.045 [2024-04-27 06:50:51.697892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.697918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.045 #40 NEW cov: 11806 ft: 15012 corp: 18/1049b lim: 100 exec/s: 40 rss: 69Mb L: 42/92 MS: 1 CrossOver- 00:08:22.045 [2024-04-27 06:50:51.747629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253445119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.747664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.045 #43 NEW cov: 11806 ft: 15025 corp: 19/1070b lim: 100 exec/s: 43 rss: 69Mb L: 21/92 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:22.045 [2024-04-27 06:50:51.798039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.798074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.045 [2024-04-27 06:50:51.798192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8626565610940643255 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.798215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.045 #44 NEW cov: 11806 ft: 15069 corp: 20/1118b lim: 100 exec/s: 44 rss: 69Mb L: 48/92 MS: 1 InsertByte- 00:08:22.045 [2024-04-27 06:50:51.848490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.848525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.045 [2024-04-27 06:50:51.848639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.848666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
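(Editor's gloss, not part of the captured log: the two line formats that dominate this output decode as follows. The libFuzzer status fields are per the upstream libFuzzer documentation and source, not SPDK-specific; the NVMe completion fields are the standard NVMe completion-queue-entry fields that spdk_nvme_print_completion prints. Sample lines are copied from run 24 above.)
  #39 NEW cov: 11806 ft: 15006 corp: 17/1007b lim: 100 exec/s: 39 rss: 68Mb L: 59/92 MS: 1 InsertByte-
  # #39       total executions when the event was logged
  # NEW       an input was just added to the in-memory corpus
  # cov:      code blocks/edges covered by the current corpus
  # ft:       "features" (finer-grained coverage signals: edge counters, value profiles, ...)
  # corp:     corpus size (17 entries, 1007 bytes total)
  # lim:      current limit on the length of new corpus entries (grows toward -max_len)
  # exec/s:   fuzzer iterations per second
  # rss:      memory consumption of the fuzzer process
  # L: 59/92  size of this input / largest input in the corpus
  # MS:       mutation sequence that produced it (here one mutation, InsertByte)
  INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
  # (00/0b)   status code type 0x0 (generic) / status code 0x0b (Invalid Namespace or Format)
  # qid/cid   queue pair ID / command identifier
  # cdw0      command-specific result dword of the completion
  # sqhd      submission queue head pointer
  # p/m/dnr   phase tag / "more" status bit / do-not-retry bit
(The (00/0b) completions are consistent with the fuzzed COMPARE commands above carrying nsid:0, an invalid namespace.)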
00:08:22.045 [2024-04-27 06:50:51.848794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.848819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.045 #45 NEW cov: 11806 ft: 15081 corp: 21/1184b lim: 100 exec/s: 45 rss: 69Mb L: 66/92 MS: 1 ShuffleBytes- 00:08:22.045 [2024-04-27 06:50:51.908358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.908384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.045 [2024-04-27 06:50:51.908514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18426398401311539199 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.045 [2024-04-27 06:50:51.908539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.045 #46 NEW cov: 11806 ft: 15103 corp: 22/1231b lim: 100 exec/s: 46 rss: 69Mb L: 47/92 MS: 1 CrossOver- 00:08:22.304 [2024-04-27 06:50:51.959064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:51.959099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.304 [2024-04-27 06:50:51.959168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:51.959190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.304 [2024-04-27 06:50:51.959315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:51.959339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.304 [2024-04-27 06:50:51.959473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:51.959493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.304 #47 NEW cov: 11806 ft: 15131 corp: 23/1324b lim: 100 exec/s: 47 rss: 69Mb L: 93/93 MS: 1 CrossOver- 00:08:22.304 [2024-04-27 06:50:52.008704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:52.008737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.304 [2024-04-27 06:50:52.008850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
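(Editor's gloss: the per-fuzzer setup traced earlier — start_llvm_fuzz 24 — reduces to the pattern sketched below. This is a hand-written reconstruction, not the script itself: $SPDK_DIR stands in for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk, the variable names mirror the run.sh locals seen in the trace, and the redirect of the sed output into the per-run config is implied by the trace rather than shown in it.)
  # illustrative reconstruction of the start_llvm_fuzz steps for fuzzer 24
  SPDK_DIR=/path/to/spdk          # assumption: SPDK repo root
  fuzzer_type=24                  # -Z: which fuzzer entry point to run
  timen=1                         # -t: minutes to fuzz
  core=0x1                        # -m: reactor core mask
  port="44$(printf %02d "$fuzzer_type")"     # fuzzer N listens on TCP port 44NN -> 4424
  nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
  corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}
  mkdir -p "$corpus_dir"
  # clone the shared JSON template, repointing the NVMe/TCP listener at this run's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
      -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" \
      -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"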
00:08:22.304 [2024-04-27 06:50:52.008873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.304 #48 NEW cov: 11806 ft: 15163 corp: 24/1366b lim: 100 exec/s: 48 rss: 69Mb L: 42/93 MS: 1 ShuffleBytes- 00:08:22.304 [2024-04-27 06:50:52.058864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:52.058898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.304 [2024-04-27 06:50:52.059036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251626288887735 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:52.059064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.304 #49 NEW cov: 11806 ft: 15182 corp: 25/1413b lim: 100 exec/s: 49 rss: 69Mb L: 47/93 MS: 1 ShuffleBytes- 00:08:22.304 [2024-04-27 06:50:52.108768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:52.108794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.304 #50 NEW cov: 11806 ft: 15207 corp: 26/1437b lim: 100 exec/s: 50 rss: 69Mb L: 24/93 MS: 1 EraseBytes- 00:08:22.304 [2024-04-27 06:50:52.169270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:52.169300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.304 [2024-04-27 06:50:52.169428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251626285772727 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.304 [2024-04-27 06:50:52.169450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.304 #51 NEW cov: 11806 ft: 15234 corp: 27/1485b lim: 100 exec/s: 51 rss: 69Mb L: 48/93 MS: 1 InsertByte- 00:08:22.563 [2024-04-27 06:50:52.219711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.219745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.219871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.219896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.220017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.220038] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.563 #52 NEW cov: 11806 ft: 15238 corp: 28/1557b lim: 100 exec/s: 52 rss: 69Mb L: 72/93 MS: 1 CrossOver- 00:08:22.563 [2024-04-27 06:50:52.269769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.269799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.269905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13189556458397087671 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.269928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.270063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13238251626885003191 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.270088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.563 #53 NEW cov: 11806 ft: 15241 corp: 29/1626b lim: 100 exec/s: 53 rss: 69Mb L: 69/93 MS: 1 CrossOver- 00:08:22.563 [2024-04-27 06:50:52.319964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.319994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.320108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.320131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.320260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.320284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.563 #54 NEW cov: 11806 ft: 15251 corp: 30/1692b lim: 100 exec/s: 54 rss: 69Mb L: 66/93 MS: 1 ChangeByte- 00:08:22.563 [2024-04-27 06:50:52.370082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.370113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.370222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4195730026713823159 len:14907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.370247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.370371] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.370400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.563 #55 NEW cov: 11806 ft: 15259 corp: 31/1764b lim: 100 exec/s: 55 rss: 69Mb L: 72/93 MS: 1 InsertByte- 00:08:22.563 [2024-04-27 06:50:52.420189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744071041974271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.420216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.563 [2024-04-27 06:50:52.420364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.563 [2024-04-27 06:50:52.420389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.563 #56 NEW cov: 11806 ft: 15318 corp: 32/1806b lim: 100 exec/s: 56 rss: 69Mb L: 42/93 MS: 1 ShuffleBytes- 00:08:22.823 [2024-04-27 06:50:52.470692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:100663296 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.470724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.823 [2024-04-27 06:50:52.470825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4294967040 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.470850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.823 [2024-04-27 06:50:52.470984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13238251629368031159 len:46907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.471008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.823 [2024-04-27 06:50:52.471139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4195730024608447034 len:14907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.471160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.823 #57 NEW cov: 11806 ft: 15330 corp: 33/1904b lim: 100 exec/s: 57 rss: 69Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:22.823 [2024-04-27 06:50:52.520653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13238251626398463927 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.520687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.823 [2024-04-27 06:50:52.520783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4195730026713823159 len:14907 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.520806] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.823 [2024-04-27 06:50:52.520931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.823 [2024-04-27 06:50:52.520952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.823 #58 NEW cov: 11806 ft: 15331 corp: 34/1976b lim: 100 exec/s: 29 rss: 70Mb L: 72/98 MS: 1 ChangeByte- 00:08:22.823 #58 DONE cov: 11806 ft: 15331 corp: 34/1976b lim: 100 exec/s: 29 rss: 70Mb 00:08:22.823 Done 58 runs in 2 second(s) 00:08:22.823 06:50:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:22.823 06:50:52 -- ../common.sh@72 -- # (( i++ )) 00:08:22.823 06:50:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.823 06:50:52 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:22.823 00:08:22.823 real 1m2.486s 00:08:22.823 user 1m38.915s 00:08:22.823 sys 0m7.200s 00:08:22.823 06:50:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.823 06:50:52 -- common/autotest_common.sh@10 -- # set +x 00:08:22.823 ************************************ 00:08:22.823 END TEST nvmf_fuzz 00:08:22.823 ************************************ 00:08:22.823 06:50:52 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:22.823 06:50:52 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:22.823 06:50:52 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:22.823 06:50:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:22.823 06:50:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:22.823 06:50:52 -- common/autotest_common.sh@10 -- # set +x 00:08:22.823 ************************************ 00:08:22.823 START TEST vfio_fuzz 00:08:22.823 ************************************ 00:08:22.823 06:50:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:23.084 * Looking for test storage... 
00:08:23.084 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:23.084 06:50:52 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:23.084 06:50:52 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:23.084 06:50:52 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:23.084 06:50:52 -- common/autotest_common.sh@34 -- # set -e 00:08:23.084 06:50:52 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:23.084 06:50:52 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:23.084 06:50:52 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:23.084 06:50:52 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:23.084 06:50:52 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:23.084 06:50:52 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:23.084 06:50:52 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:23.084 06:50:52 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:23.084 06:50:52 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:23.084 06:50:52 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:23.084 06:50:52 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:23.084 06:50:52 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:23.084 06:50:52 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:23.084 06:50:52 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:23.084 06:50:52 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:23.085 06:50:52 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:23.085 06:50:52 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:23.085 06:50:52 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:23.085 06:50:52 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:23.085 06:50:52 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:23.085 06:50:52 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:23.085 06:50:52 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:23.085 06:50:52 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:23.085 06:50:52 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:23.085 06:50:52 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:23.085 06:50:52 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:23.085 06:50:52 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:23.085 06:50:52 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:23.085 06:50:52 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:23.085 06:50:52 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:23.085 06:50:52 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:23.085 06:50:52 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:23.085 06:50:52 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:23.085 06:50:52 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:23.085 06:50:52 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:23.085 06:50:52 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:23.085 06:50:52 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:23.085 06:50:52 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:23.085 06:50:52 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:23.085 06:50:52 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:23.085 06:50:52 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:23.085 06:50:52 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:23.085 06:50:52 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:23.085 06:50:52 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:23.085 06:50:52 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:23.085 06:50:52 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:23.085 06:50:52 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:23.085 06:50:52 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:23.085 06:50:52 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:23.085 06:50:52 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:23.085 06:50:52 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:23.085 06:50:52 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:23.085 06:50:52 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:23.085 06:50:52 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:23.085 06:50:52 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:23.085 06:50:52 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:23.085 06:50:52 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:23.085 06:50:52 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:23.085 06:50:52 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:23.085 06:50:52 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:23.085 06:50:52 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:23.085 06:50:52 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:23.085 06:50:52 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:23.085 06:50:52 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=n 00:08:23.085 06:50:52 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:23.085 06:50:52 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:23.085 06:50:52 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:23.085 06:50:52 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:23.085 06:50:52 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:23.085 06:50:52 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:23.085 06:50:52 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:23.085 06:50:52 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:23.085 06:50:52 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:23.085 06:50:52 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:23.085 06:50:52 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:23.085 06:50:52 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:23.085 06:50:52 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:23.085 06:50:52 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:23.085 06:50:52 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:23.085 06:50:52 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:23.085 06:50:52 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:23.085 06:50:52 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:23.085 
06:50:52 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:23.085 06:50:52 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:23.085 06:50:52 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:23.085 06:50:52 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:23.085 06:50:52 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:23.085 06:50:52 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:23.085 06:50:52 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:23.085 06:50:52 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:23.085 06:50:52 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:23.085 06:50:52 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:23.085 06:50:52 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:23.085 06:50:52 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:23.085 06:50:52 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:23.085 06:50:52 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:23.085 06:50:52 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:23.085 06:50:52 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:23.085 06:50:52 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:23.085 #define SPDK_CONFIG_H 00:08:23.085 #define SPDK_CONFIG_APPS 1 00:08:23.085 #define SPDK_CONFIG_ARCH native 00:08:23.085 #undef SPDK_CONFIG_ASAN 00:08:23.085 #undef SPDK_CONFIG_AVAHI 00:08:23.085 #undef SPDK_CONFIG_CET 00:08:23.085 #define SPDK_CONFIG_COVERAGE 1 00:08:23.085 #define SPDK_CONFIG_CROSS_PREFIX 00:08:23.085 #undef SPDK_CONFIG_CRYPTO 00:08:23.085 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:23.085 #undef SPDK_CONFIG_CUSTOMOCF 00:08:23.085 #undef SPDK_CONFIG_DAOS 00:08:23.085 #define SPDK_CONFIG_DAOS_DIR 00:08:23.085 #define SPDK_CONFIG_DEBUG 1 00:08:23.085 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:23.085 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:23.085 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:23.085 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:23.085 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:23.085 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:23.085 #define SPDK_CONFIG_EXAMPLES 1 00:08:23.085 #undef SPDK_CONFIG_FC 00:08:23.085 #define SPDK_CONFIG_FC_PATH 00:08:23.085 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:23.085 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:23.085 #undef SPDK_CONFIG_FUSE 00:08:23.085 #define SPDK_CONFIG_FUZZER 1 00:08:23.085 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:23.085 #undef SPDK_CONFIG_GOLANG 00:08:23.085 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:23.085 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:23.085 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:23.085 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:23.085 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:23.085 #define SPDK_CONFIG_IDXD 1 00:08:23.085 #undef SPDK_CONFIG_IDXD_KERNEL 00:08:23.085 #undef SPDK_CONFIG_IPSEC_MB 00:08:23.085 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:23.085 #define SPDK_CONFIG_ISAL 1 00:08:23.085 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:23.085 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:23.085 #define SPDK_CONFIG_LIBDIR 00:08:23.085 #undef SPDK_CONFIG_LTO 00:08:23.085 #define SPDK_CONFIG_MAX_LCORES 00:08:23.085 #define SPDK_CONFIG_NVME_CUSE 1 00:08:23.085 #undef SPDK_CONFIG_OCF 00:08:23.085 #define SPDK_CONFIG_OCF_PATH 00:08:23.085 #define SPDK_CONFIG_OPENSSL_PATH 00:08:23.085 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:23.085 #undef SPDK_CONFIG_PGO_USE 00:08:23.085 #define SPDK_CONFIG_PREFIX /usr/local 00:08:23.085 #undef SPDK_CONFIG_RAID5F 00:08:23.085 #undef SPDK_CONFIG_RBD 00:08:23.085 #define SPDK_CONFIG_RDMA 1 00:08:23.085 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:23.085 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:23.085 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:23.085 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:23.085 #undef SPDK_CONFIG_SHARED 00:08:23.085 #undef SPDK_CONFIG_SMA 00:08:23.085 #define SPDK_CONFIG_TESTS 1 00:08:23.085 #undef SPDK_CONFIG_TSAN 00:08:23.085 #define SPDK_CONFIG_UBLK 1 00:08:23.085 #define SPDK_CONFIG_UBSAN 1 00:08:23.085 #undef SPDK_CONFIG_UNIT_TESTS 00:08:23.085 #undef SPDK_CONFIG_URING 00:08:23.086 #define SPDK_CONFIG_URING_PATH 00:08:23.086 #undef SPDK_CONFIG_URING_ZNS 00:08:23.086 #undef SPDK_CONFIG_USDT 00:08:23.086 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:23.086 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:23.086 #define SPDK_CONFIG_VFIO_USER 1 00:08:23.086 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:23.086 #define SPDK_CONFIG_VHOST 1 00:08:23.086 #define SPDK_CONFIG_VIRTIO 1 00:08:23.086 #undef SPDK_CONFIG_VTUNE 00:08:23.086 #define SPDK_CONFIG_VTUNE_DIR 00:08:23.086 #define SPDK_CONFIG_WERROR 1 00:08:23.086 #define SPDK_CONFIG_WPDK_DIR 00:08:23.086 #undef SPDK_CONFIG_XNVME 00:08:23.086 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:23.086 06:50:52 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:23.086 06:50:52 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:23.086 06:50:52 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:23.086 06:50:52 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:23.086 06:50:52 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:23.086 06:50:52 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:23.086 06:50:52 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:23.086 06:50:52 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:23.086 06:50:52 -- paths/export.sh@5 -- # export PATH 00:08:23.086 06:50:52 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:23.086 06:50:52 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:23.086 06:50:52 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:23.086 06:50:52 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:23.086 06:50:52 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:23.086 06:50:52 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:23.086 06:50:52 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:23.086 06:50:52 -- pm/common@16 -- # TEST_TAG=N/A 00:08:23.086 06:50:52 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:23.086 06:50:52 -- common/autotest_common.sh@52 -- # : 1 00:08:23.086 06:50:52 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:23.086 06:50:52 -- common/autotest_common.sh@56 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:23.086 06:50:52 -- common/autotest_common.sh@58 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:23.086 06:50:52 -- common/autotest_common.sh@60 -- # : 1 00:08:23.086 06:50:52 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:23.086 06:50:52 -- common/autotest_common.sh@62 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:23.086 06:50:52 -- common/autotest_common.sh@64 -- # : 00:08:23.086 06:50:52 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:23.086 06:50:52 -- common/autotest_common.sh@66 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:23.086 06:50:52 
-- common/autotest_common.sh@68 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:23.086 06:50:52 -- common/autotest_common.sh@70 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:23.086 06:50:52 -- common/autotest_common.sh@72 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:23.086 06:50:52 -- common/autotest_common.sh@74 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:23.086 06:50:52 -- common/autotest_common.sh@76 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:23.086 06:50:52 -- common/autotest_common.sh@78 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:23.086 06:50:52 -- common/autotest_common.sh@80 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:23.086 06:50:52 -- common/autotest_common.sh@82 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:23.086 06:50:52 -- common/autotest_common.sh@84 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:23.086 06:50:52 -- common/autotest_common.sh@86 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:23.086 06:50:52 -- common/autotest_common.sh@88 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:23.086 06:50:52 -- common/autotest_common.sh@90 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:23.086 06:50:52 -- common/autotest_common.sh@92 -- # : 1 00:08:23.086 06:50:52 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:23.086 06:50:52 -- common/autotest_common.sh@94 -- # : 1 00:08:23.086 06:50:52 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:23.086 06:50:52 -- common/autotest_common.sh@96 -- # : rdma 00:08:23.086 06:50:52 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:23.086 06:50:52 -- common/autotest_common.sh@98 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:23.086 06:50:52 -- common/autotest_common.sh@100 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:23.086 06:50:52 -- common/autotest_common.sh@102 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:23.086 06:50:52 -- common/autotest_common.sh@104 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:23.086 06:50:52 -- common/autotest_common.sh@106 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:23.086 06:50:52 -- common/autotest_common.sh@108 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:23.086 06:50:52 -- common/autotest_common.sh@110 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:23.086 06:50:52 -- common/autotest_common.sh@112 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:23.086 06:50:52 -- common/autotest_common.sh@114 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:23.086 
06:50:52 -- common/autotest_common.sh@116 -- # : 1 00:08:23.086 06:50:52 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:23.086 06:50:52 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:23.086 06:50:52 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:23.086 06:50:52 -- common/autotest_common.sh@120 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:23.086 06:50:52 -- common/autotest_common.sh@122 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:23.086 06:50:52 -- common/autotest_common.sh@124 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:23.086 06:50:52 -- common/autotest_common.sh@126 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:23.086 06:50:52 -- common/autotest_common.sh@128 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:23.086 06:50:52 -- common/autotest_common.sh@130 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:23.086 06:50:52 -- common/autotest_common.sh@132 -- # : v23.11 00:08:23.086 06:50:52 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:23.086 06:50:52 -- common/autotest_common.sh@134 -- # : true 00:08:23.086 06:50:52 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:23.086 06:50:52 -- common/autotest_common.sh@136 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:23.086 06:50:52 -- common/autotest_common.sh@138 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:23.086 06:50:52 -- common/autotest_common.sh@140 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:23.086 06:50:52 -- common/autotest_common.sh@142 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:23.086 06:50:52 -- common/autotest_common.sh@144 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:23.086 06:50:52 -- common/autotest_common.sh@146 -- # : 0 00:08:23.086 06:50:52 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:23.086 06:50:52 -- common/autotest_common.sh@148 -- # : 00:08:23.086 06:50:52 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:23.086 06:50:52 -- common/autotest_common.sh@150 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:23.087 06:50:52 -- common/autotest_common.sh@152 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:23.087 06:50:52 -- common/autotest_common.sh@154 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:23.087 06:50:52 -- common/autotest_common.sh@156 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:23.087 06:50:52 -- common/autotest_common.sh@158 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:23.087 06:50:52 -- common/autotest_common.sh@160 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:23.087 06:50:52 -- common/autotest_common.sh@163 -- # : 00:08:23.087 06:50:52 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:23.087 06:50:52 -- common/autotest_common.sh@165 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:23.087 06:50:52 -- common/autotest_common.sh@167 -- # : 0 00:08:23.087 06:50:52 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:23.087 06:50:52 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:23.087 06:50:52 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:23.087 06:50:52 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:23.087 06:50:52 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:23.087 06:50:52 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:23.087 06:50:52 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:23.087 06:50:52 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:23.087 06:50:52 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:23.087 06:50:52 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:23.087 06:50:52 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:23.087 06:50:52 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:23.087 06:50:52 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:23.087 06:50:52 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:23.087 06:50:52 -- common/autotest_common.sh@196 -- # cat 00:08:23.087 06:50:52 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:23.087 06:50:52 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:23.087 06:50:52 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:23.087 06:50:52 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:23.087 06:50:52 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:23.087 06:50:52 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:23.087 06:50:52 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:23.087 06:50:52 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:23.087 06:50:52 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:23.087 06:50:52 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:23.087 06:50:52 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:23.087 06:50:52 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:23.087 06:50:52 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:23.087 06:50:52 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:23.087 06:50:52 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:23.087 06:50:52 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:23.087 06:50:52 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:23.087 06:50:52 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:23.087 06:50:52 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:23.087 06:50:52 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:23.087 06:50:52 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:23.087 06:50:52 -- common/autotest_common.sh@249 -- # valgrind= 00:08:23.087 06:50:52 -- common/autotest_common.sh@255 -- # uname -s 00:08:23.087 06:50:52 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:23.087 06:50:52 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:23.087 06:50:52 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:23.087 06:50:52 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:23.087 06:50:52 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:23.087 06:50:52 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:23.087 06:50:52 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:23.087 06:50:52 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:23.087 06:50:52 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:23.087 06:50:52 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:23.087 06:50:52 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:23.087 06:50:52 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:23.087 06:50:52 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:23.087 06:50:52 -- common/autotest_common.sh@309 -- # [[ -z 2629614 ]] 00:08:23.087 06:50:52 -- common/autotest_common.sh@309 -- # kill -0 2629614 00:08:23.087 06:50:52 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:23.087 06:50:52 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:23.087 06:50:52 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:23.087 06:50:52 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:23.087 06:50:52 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:23.087 06:50:52 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:23.087 06:50:52 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:23.087 06:50:52 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:23.087 06:50:52 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.X0E9pV 00:08:23.087 06:50:52 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:23.087 06:50:52 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:23.087 06:50:52 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:08:23.087 06:50:52 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.X0E9pV/tests/vfio /tmp/spdk.X0E9pV 00:08:23.087 06:50:52 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:23.087 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.087 06:50:52 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:23.087 06:50:52 -- common/autotest_common.sh@318 -- # df -T 00:08:23.087 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:23.087 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:23.087 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:23.087 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:23.087 06:50:52 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=1052192768 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:23.088 06:50:52 -- common/autotest_common.sh@354 -- # uses["$mount"]=4232237056 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=52181766144 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742297088 00:08:23.088 06:50:52 -- common/autotest_common.sh@354 -- # uses["$mount"]=9560530944 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868553728 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871146496 00:08:23.088 06:50:52 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342489088 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348461056 00:08:23.088 06:50:52 -- common/autotest_common.sh@354 -- # uses["$mount"]=5971968 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870499328 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871150592 00:08:23.088 06:50:52 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=651264 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:23.088 06:50:52 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:23.088 06:50:52 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:23.088 06:50:52 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:23.088 06:50:52 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:23.088 * Looking for test storage... 00:08:23.088 06:50:52 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:23.088 06:50:52 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:23.088 06:50:52 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:23.088 06:50:52 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:23.088 06:50:52 -- common/autotest_common.sh@363 -- # mount=/ 00:08:23.088 06:50:52 -- common/autotest_common.sh@365 -- # target_space=52181766144 00:08:23.088 06:50:52 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:23.088 06:50:52 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:23.088 06:50:52 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:23.088 06:50:52 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:23.088 06:50:52 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:23.088 06:50:52 -- common/autotest_common.sh@372 -- # new_size=11775123456 00:08:23.088 06:50:52 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:23.088 06:50:52 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:23.088 06:50:52 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:23.088 06:50:52 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:23.088 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:23.088 06:50:52 -- common/autotest_common.sh@380 -- # return 0 00:08:23.088 06:50:52 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:23.088 06:50:52 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:23.088 06:50:52 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:23.088 06:50:52 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:23.088 06:50:52 -- common/autotest_common.sh@1672 -- # true 00:08:23.088 06:50:52 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:23.088 06:50:52 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:23.088 06:50:52 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:23.088 06:50:52 -- common/autotest_common.sh@27 -- # exec 00:08:23.088 06:50:52 -- common/autotest_common.sh@29 -- # exec 00:08:23.088 06:50:52 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:08:23.088 06:50:52 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:23.088 06:50:52 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:23.088 06:50:52 -- common/autotest_common.sh@18 -- # set -x 00:08:23.088 06:50:52 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:23.088 06:50:52 -- ../common.sh@8 -- # pids=() 00:08:23.088 06:50:52 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:23.088 06:50:52 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:23.088 06:50:52 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:23.088 06:50:52 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:23.088 06:50:52 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:23.088 06:50:52 -- vfio/run.sh@65 -- # mem_size=0 00:08:23.088 06:50:52 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:23.088 06:50:52 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:23.088 06:50:52 -- ../common.sh@69 -- # local fuzz_num=7 00:08:23.088 06:50:52 -- ../common.sh@70 -- # local time=1 00:08:23.088 06:50:52 -- ../common.sh@72 -- # (( i = 0 )) 00:08:23.088 06:50:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.088 06:50:52 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:23.088 06:50:52 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:23.088 06:50:52 -- vfio/run.sh@23 -- # local timen=1 00:08:23.088 06:50:52 -- vfio/run.sh@24 -- # local core=0x1 00:08:23.088 06:50:52 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:23.088 06:50:52 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:23.088 06:50:52 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:23.088 06:50:52 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:23.088 06:50:52 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:23.088 06:50:52 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:23.347 06:50:52 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:23.347 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:23.347 06:50:52 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:23.348 [2024-04-27 06:50:53.010862] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
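A note on the wiring traced above: vfio/run.sh derives the number of fuzzer types from the harness source (the traced grep -c '\.fn =' over llvm_vfio_fuzz.c yields fuzz_num=7, so -Z ranges over 0..6) and stamps out one isolated vfio-user environment per type. A minimal sketch of that per-type setup, with SPDK_DIR and CORPUS_DIR as assumed placeholders and every flag copied from the trace (only the -P output prefix is omitted):

    # Sketch only: SPDK_DIR and CORPUS_DIR are assumptions; flags mirror run.sh@25-38.
    FUZZER_TYPE=0
    BASE=/tmp/vfio-user-$FUZZER_TYPE
    mkdir -p "$BASE/domain/1" "$BASE/domain/2" "$CORPUS_DIR"
    # Give each instance private sockets by rewriting the shared config template.
    sed -e "s%/tmp/vfio-user/domain/1%$BASE/domain/1%; s%/tmp/vfio-user/domain/2%$BASE/domain/2%" \
        "$SPDK_DIR/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$BASE/fuzz_vfio_json.conf"
    "$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" -m 0x1 -s 0 \
        -F "$BASE/domain/1" -c "$BASE/fuzz_vfio_json.conf" -t 1 \
        -D "$CORPUS_DIR" -Y "$BASE/domain/2" \
        -r "$BASE/spdk${FUZZER_TYPE}.sock" -Z "$FUZZER_TYPE"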
00:08:23.348 [2024-04-27 06:50:53.010927] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2629661 ] 00:08:23.348 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.348 [2024-04-27 06:50:53.081387] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.348 [2024-04-27 06:50:53.117386] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.348 [2024-04-27 06:50:53.117552] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.606 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.606 INFO: Seed: 337938437 00:08:23.606 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:23.606 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:23.606 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:23.606 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.606 #2 INITED exec/s: 0 rss: 60Mb 00:08:23.606 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.606 This may also happen if the target rejected all inputs we tried so far 00:08:24.182 NEW_FUNC[1/621]: 0x49cfc0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:24.182 NEW_FUNC[2/621]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:24.182 #4 NEW cov: 10704 ft: 10678 corp: 2/25b lim: 60 exec/s: 0 rss: 66Mb L: 24/24 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:24.182 #5 NEW cov: 10721 ft: 13400 corp: 3/49b lim: 60 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:24.441 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.441 #6 NEW cov: 10738 ft: 14621 corp: 4/95b lim: 60 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 InsertRepeatedBytes- 00:08:24.441 #7 NEW cov: 10738 ft: 15064 corp: 5/104b lim: 60 exec/s: 0 rss: 68Mb L: 9/46 MS: 1 CMP- DE: "\001\000\000\000\000\000\002\000"- 00:08:24.699 #8 NEW cov: 10738 ft: 15289 corp: 6/151b lim: 60 exec/s: 8 rss: 68Mb L: 47/47 MS: 1 InsertByte- 00:08:24.958 #9 NEW cov: 10738 ft: 15581 corp: 7/160b lim: 60 exec/s: 9 rss: 68Mb L: 9/47 MS: 1 ChangeByte- 00:08:24.958 #10 NEW cov: 10738 ft: 15780 corp: 8/181b lim: 60 exec/s: 10 rss: 68Mb L: 21/47 MS: 1 InsertRepeatedBytes- 00:08:25.218 #11 NEW cov: 10738 ft: 15989 corp: 9/205b lim: 60 exec/s: 11 rss: 68Mb L: 24/47 MS: 1 ChangeBinInt- 00:08:25.477 #12 NEW cov: 10745 ft: 16206 corp: 10/222b lim: 60 exec/s: 12 rss: 68Mb L: 17/47 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\002\000"- 00:08:25.477 #13 NEW cov: 10745 ft: 16535 corp: 11/231b lim: 60 exec/s: 6 rss: 68Mb L: 9/47 MS: 1 ShuffleBytes- 00:08:25.477 #13 DONE cov: 10745 ft: 16535 corp: 11/231b lim: 60 exec/s: 6 rss: 68Mb 00:08:25.477 ###### Recommended dictionary. ###### 00:08:25.477 "\001\000\000\000\000\000\002\000" # Uses: 1 00:08:25.477 ###### End of recommended dictionary. 
###### 00:08:25.477 Done 13 runs in 2 second(s) 00:08:25.737 06:50:55 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:25.737 06:50:55 -- ../common.sh@72 -- # (( i++ )) 00:08:25.737 06:50:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.737 06:50:55 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:25.737 06:50:55 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:25.737 06:50:55 -- vfio/run.sh@23 -- # local timen=1 00:08:25.737 06:50:55 -- vfio/run.sh@24 -- # local core=0x1 00:08:25.737 06:50:55 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:25.737 06:50:55 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:25.737 06:50:55 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:25.737 06:50:55 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:25.737 06:50:55 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:25.737 06:50:55 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:25.737 06:50:55 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:25.737 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:25.737 06:50:55 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:25.737 [2024-04-27 06:50:55.629170] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:25.737 [2024-04-27 06:50:55.629267] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2630197 ] 00:08:25.996 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.996 [2024-04-27 06:50:55.709495] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.996 [2024-04-27 06:50:55.744594] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.996 [2024-04-27 06:50:55.744737] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.255 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.255 INFO: Seed: 2963938193 00:08:26.255 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:26.255 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:26.255 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:26.255 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.255 #2 INITED exec/s: 0 rss: 60Mb 00:08:26.255 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
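Reading the fuzzer records above: each "#N NEW" line is a standard libFuzzer status record, where cov counts covered code blocks/edges, ft counts coverage features, corp gives corpus entries and total bytes, lim is the current input-length cap, exec/s and rss are throughput and resident memory, L gives the new unit's length and the largest unit so far, and MS names the mutation sequence that produced it. A hedged one-liner for tracking coverage growth offline, assuming fuzz.log is a plain capture of this console output (one record per line, no CI timestamp column):

    # Sketch: print (event id, event, cov) for each coverage-bearing status line.
    grep -oE '#[0-9]+ (INITED|NEW|REDUCE|DONE)[^#]*' fuzz.log |
        awk '{ for (i = 1; i <= NF; i++) if ($i == "cov:") print $1, $2, "cov=" $(i+1) }'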
00:08:26.255 This may also happen if the target rejected all inputs we tried so far 00:08:26.255 [2024-04-27 06:50:56.025449] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:26.255 [2024-04-27 06:50:56.025483] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:26.255 [2024-04-27 06:50:56.025502] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:26.514 NEW_FUNC[1/628]: 0x49d560 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:26.514 NEW_FUNC[2/628]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:26.514 #3 NEW cov: 10718 ft: 10414 corp: 2/18b lim: 40 exec/s: 0 rss: 66Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:08:26.773 [2024-04-27 06:50:56.485642] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:26.773 [2024-04-27 06:50:56.485679] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:26.773 [2024-04-27 06:50:56.485696] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:26.773 #4 NEW cov: 10732 ft: 14091 corp: 3/40b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:27.032 [2024-04-27 06:50:56.678375] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:27.032 [2024-04-27 06:50:56.678405] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:27.032 [2024-04-27 06:50:56.678423] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:27.032 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:27.032 #5 NEW cov: 10749 ft: 15078 corp: 4/63b lim: 40 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 InsertByte- 00:08:27.032 [2024-04-27 06:50:56.859794] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:27.032 [2024-04-27 06:50:56.859815] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:27.032 [2024-04-27 06:50:56.859831] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:27.291 #6 NEW cov: 10749 ft: 15817 corp: 5/86b lim: 40 exec/s: 6 rss: 68Mb L: 23/23 MS: 1 ChangeBit- 00:08:27.291 [2024-04-27 06:50:57.045762] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:27.291 [2024-04-27 06:50:57.045783] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:27.291 [2024-04-27 06:50:57.045800] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:27.291 #7 NEW cov: 10749 ft: 16815 corp: 6/108b lim: 40 exec/s: 7 rss: 70Mb L: 22/23 MS: 1 CrossOver- 00:08:27.549 [2024-04-27 06:50:57.227523] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:27.550 [2024-04-27 06:50:57.227544] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:27.550 [2024-04-27 06:50:57.227560] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:27.550 #8 NEW cov: 10749 ft: 17222 corp: 7/131b lim: 40 exec/s: 8 rss: 70Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:27.550 [2024-04-27 06:50:57.410974] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:27.550 [2024-04-27 06:50:57.410996] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:27.550 [2024-04-27 06:50:57.411014] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:27.808 #9 NEW cov: 10749 ft: 17341 corp: 8/148b lim: 40 exec/s: 9 rss: 70Mb L: 17/23 MS: 1 ChangeByte- 00:08:27.808 [2024-04-27 06:50:57.592546] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:27.808 [2024-04-27 06:50:57.592566] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:27.808 [2024-04-27 06:50:57.592582] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:27.808 #10 NEW cov: 10749 ft: 17510 corp: 9/170b lim: 40 exec/s: 10 rss: 70Mb L: 22/23 MS: 1 ChangeByte- 00:08:28.067 [2024-04-27 06:50:57.773560] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:28.067 [2024-04-27 06:50:57.773585] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:28.067 [2024-04-27 06:50:57.773601] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:28.067 #11 NEW cov: 10756 ft: 17685 corp: 10/187b lim: 40 exec/s: 11 rss: 70Mb L: 17/23 MS: 1 ChangeBit- 00:08:28.067 [2024-04-27 06:50:57.956427] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:28.067 [2024-04-27 06:50:57.956459] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:28.067 [2024-04-27 06:50:57.956476] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:28.326 #12 NEW cov: 10756 ft: 17744 corp: 11/204b lim: 40 exec/s: 6 rss: 70Mb L: 17/23 MS: 1 ShuffleBytes- 00:08:28.326 #12 DONE cov: 10756 ft: 17744 corp: 11/204b lim: 40 exec/s: 6 rss: 70Mb 00:08:28.326 Done 12 runs in 2 second(s) 00:08:28.585 06:50:58 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:28.585 06:50:58 -- ../common.sh@72 -- # (( i++ )) 00:08:28.585 06:50:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.585 06:50:58 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:28.585 06:50:58 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:28.585 06:50:58 -- vfio/run.sh@23 -- # local timen=1 00:08:28.585 06:50:58 -- vfio/run.sh@24 -- # local core=0x1 00:08:28.585 06:50:58 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:28.585 06:50:58 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:28.585 06:50:58 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:28.585 06:50:58 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:28.585 06:50:58 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:28.585 06:50:58 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:28.585 06:50:58 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:28.585 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:28.585 06:50:58 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:28.585 [2024-04-27 06:50:58.361740] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:28.585 [2024-04-27 06:50:58.361833] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2630715 ] 00:08:28.585 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.585 [2024-04-27 06:50:58.435405] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.585 [2024-04-27 06:50:58.470342] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.585 [2024-04-27 06:50:58.470504] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.844 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.844 INFO: Seed: 1392968471 00:08:28.844 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:28.844 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:28.844 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:28.844 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.844 #2 INITED exec/s: 0 rss: 60Mb 00:08:28.844 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
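The startup warning repeated in each run above is expected here: every run reports "0 files found" in its corpus directory and "A corpus is not provided, starting from an empty corpus", so the first executions have no inputs to learn from. Seeding the per-type corpus directory is one way to quiet it on later runs; a sketch, with the path taken from the trace, the file name seed0 assumed, and the seed bytes arbitrarily borrowed from the dictionary entry recommended by run 0:

    # Sketch: drop a trivial seed into the corpus directory used by fuzzer type 2.
    mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/corpus/llvm_vfio_2
    printf '\x01\x00\x00\x00\x00\x00\x02\x00' \
        > /var/jenkins/workspace/short-fuzz-phy-autotest/corpus/llvm_vfio_2/seed0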
00:08:28.844 This may also happen if the target rejected all inputs we tried so far 00:08:29.103 [2024-04-27 06:50:58.752474] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:29.103 [2024-04-27 06:50:58.752527] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:29.362 NEW_FUNC[1/628]: 0x49df40 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:29.362 NEW_FUNC[2/628]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:29.362 #11 NEW cov: 10707 ft: 10625 corp: 2/50b lim: 80 exec/s: 0 rss: 66Mb L: 49/49 MS: 4 ShuffleBytes-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:29.362 [2024-04-27 06:50:59.209859] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:29.362 [2024-04-27 06:50:59.209900] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:29.621 #12 NEW cov: 10724 ft: 14056 corp: 3/95b lim: 80 exec/s: 0 rss: 67Mb L: 45/49 MS: 1 EraseBytes- 00:08:29.621 [2024-04-27 06:50:59.389047] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:29.621 [2024-04-27 06:50:59.389076] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:29.621 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.621 #15 NEW cov: 10741 ft: 14813 corp: 4/103b lim: 80 exec/s: 0 rss: 68Mb L: 8/49 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:29.880 [2024-04-27 06:50:59.580224] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:29.880 #20 NEW cov: 10742 ft: 15841 corp: 5/176b lim: 80 exec/s: 20 rss: 68Mb L: 73/73 MS: 5 InsertByte-ChangeByte-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:29.880 [2024-04-27 06:50:59.761222] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:29.880 [2024-04-27 06:50:59.761251] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:30.141 #21 NEW cov: 10742 ft: 16083 corp: 6/248b lim: 80 exec/s: 21 rss: 68Mb L: 72/73 MS: 1 CopyPart- 00:08:30.141 [2024-04-27 06:50:59.941769] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:30.400 #22 NEW cov: 10742 ft: 16362 corp: 7/284b lim: 80 exec/s: 22 rss: 68Mb L: 36/73 MS: 1 InsertRepeatedBytes- 00:08:30.400 [2024-04-27 06:51:00.123883] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:30.400 [2024-04-27 06:51:00.123921] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:30.400 #23 NEW cov: 10742 ft: 16684 corp: 8/293b lim: 80 exec/s: 23 rss: 68Mb L: 9/73 MS: 1 InsertByte- 00:08:30.660 [2024-04-27 06:51:00.307207] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:30.660 #26 NEW cov: 10742 ft: 16743 corp: 9/334b lim: 80 exec/s: 26 rss: 68Mb L: 41/73 MS: 3 EraseBytes-InsertByte-CrossOver- 00:08:30.660 [2024-04-27 06:51:00.486131] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:30.660 [2024-04-27 06:51:00.486161] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:30.919 #27 NEW cov: 10749 ft: 16890 corp: 10/387b lim: 80 exec/s: 27 rss: 68Mb L: 53/73 MS: 1 CMP- 
DE: "\377\377~\227H\254\320`"- 00:08:30.919 [2024-04-27 06:51:00.664468] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:30.919 [2024-04-27 06:51:00.664498] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:30.919 #28 NEW cov: 10749 ft: 16935 corp: 11/464b lim: 80 exec/s: 14 rss: 68Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:08:30.919 #28 DONE cov: 10749 ft: 16935 corp: 11/464b lim: 80 exec/s: 14 rss: 68Mb 00:08:30.919 ###### Recommended dictionary. ###### 00:08:30.919 "\377\377~\227H\254\320`" # Uses: 0 00:08:30.919 ###### End of recommended dictionary. ###### 00:08:30.919 Done 28 runs in 2 second(s) 00:08:31.179 06:51:01 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:31.179 06:51:01 -- ../common.sh@72 -- # (( i++ )) 00:08:31.179 06:51:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.179 06:51:01 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:31.179 06:51:01 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:31.179 06:51:01 -- vfio/run.sh@23 -- # local timen=1 00:08:31.179 06:51:01 -- vfio/run.sh@24 -- # local core=0x1 00:08:31.179 06:51:01 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:31.179 06:51:01 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:31.179 06:51:01 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:31.179 06:51:01 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:31.179 06:51:01 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:31.179 06:51:01 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:31.179 06:51:01 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:31.179 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:31.179 06:51:01 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:31.179 [2024-04-27 06:51:01.058346] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:31.179 [2024-04-27 06:51:01.058420] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2631141 ] 00:08:31.440 EAL: No free 2048 kB hugepages reported on node 1 00:08:31.440 [2024-04-27 06:51:01.125745] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.440 [2024-04-27 06:51:01.161945] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:31.440 [2024-04-27 06:51:01.162092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.700 INFO: Running with entropic power schedule (0xFF, 100). 
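The "Recommended dictionary" blocks closing runs 0 and 2 above are printed in libFuzzer's own dictionary syntax, so the entries can be fed back into later runs. A sketch, with fuzz.log and vfio.dict as assumed file names and fuzz.log again assumed to be a raw console capture without the CI timestamp column:

    # Sketch: save the recommended entries (usage comments stripped) and reuse
    # them via libFuzzer's -dict= option alongside the flags shown in the trace.
    awk '/Recommended dictionary/ {grab = 1; next}
         /End of recommended dictionary/ {grab = 0}
         grab' fuzz.log | sed 's/ *# Uses:.*$//' > vfio.dict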
00:08:31.700 INFO: Seed: 4084976517
00:08:31.700 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd),
00:08:31.700 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0),
00:08:31.700 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:08:31.700 INFO: A corpus is not provided, starting from an empty corpus
00:08:31.700 #2 INITED exec/s: 0 rss: 62Mb
00:08:31.700 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:31.700 This may also happen if the target rejected all inputs we tried so far
00:08:31.700 [2024-04-27 06:51:01.445472] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument
00:08:31.700 [2024-04-27 06:51:01.445506] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:08:31.700 [2024-04-27 06:51:01.445518] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:31.700 [2024-04-27 06:51:01.445537] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:31.959 NEW_FUNC[1/628]: 0x49e620 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125
00:08:31.959 NEW_FUNC[2/628]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:31.959 #4 NEW cov: 10716 ft: 10597 corp: 2/101b lim: 320 exec/s: 0 rss: 68Mb L: 100/100 MS: 2 CopyPart-InsertRepeatedBytes-
00:08:32.219 [2024-04-27 06:51:01.915047] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:32.219 [2024-04-27 06:51:01.915083] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:32.219 [2024-04-27 06:51:01.915095] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:32.219 [2024-04-27 06:51:01.915112] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:32.219 #5 NEW cov: 10730 ft: 13875 corp: 3/201b lim: 320 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 CMP- DE: "\377\005"-
00:08:32.219 [2024-04-27 06:51:02.100939] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:32.219 [2024-04-27 06:51:02.100963] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:32.219 [2024-04-27 06:51:02.100974] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:32.219 [2024-04-27 06:51:02.100990] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:32.479 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:32.479 #6 NEW cov: 10747 ft: 14934 corp: 4/303b lim: 320 exec/s: 0 rss: 70Mb L: 102/102 MS: 1 PersAutoDict- DE: "\377\005"-
00:08:32.479 [2024-04-27 06:51:02.285724] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:32.479 [2024-04-27 06:51:02.285746] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:32.479 [2024-04-27 06:51:02.285756] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:32.479 [2024-04-27 06:51:02.285772] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:32.738 #7 NEW cov: 10747 ft: 15639 corp: 5/371b lim: 320 exec/s: 7 rss: 70Mb L: 68/102 MS: 1 EraseBytes-
00:08:32.738 [2024-04-27 06:51:02.468414] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:32.738 [2024-04-27 06:51:02.468437] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:32.738 [2024-04-27 06:51:02.468448] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:32.738 [2024-04-27 06:51:02.468464] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:32.738 #8 NEW cov: 10747 ft: 15723 corp: 6/473b lim: 320 exec/s: 8 rss: 70Mb L: 102/102 MS: 1 ShuffleBytes-
00:08:32.997 [2024-04-27 06:51:02.651004] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:08:32.997 [2024-04-27 06:51:02.651027] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:08:32.997 [2024-04-27 06:51:02.651037] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:32.997 [2024-04-27 06:51:02.651053] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:32.997 #9 NEW cov: 10747 ft: 16625 corp: 7/590b lim: 320 exec/s: 9 rss: 70Mb L: 117/117 MS: 1 InsertRepeatedBytes-
00:08:32.997 [2024-04-27 06:51:02.841561] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:32.997 [2024-04-27 06:51:02.841584] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:32.997 [2024-04-27 06:51:02.841594] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:32.997 [2024-04-27 06:51:02.841627] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:33.255 #10 NEW cov: 10747 ft: 16806 corp: 8/658b lim: 320 exec/s: 10 rss: 70Mb L: 68/117 MS: 1 ChangeBit-
00:08:33.255 [2024-04-27 06:51:03.026208] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:33.255 [2024-04-27 06:51:03.026231] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:33.255 [2024-04-27 06:51:03.026246] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:33.255 [2024-04-27 06:51:03.026262] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:33.255 #11 NEW cov: 10747 ft: 16975 corp: 9/761b lim: 320 exec/s: 11 rss: 70Mb L: 103/117 MS: 1 InsertByte-
00:08:33.512 #12 NEW cov: 10758 ft: 17337 corp: 10/841b lim: 320 exec/s: 12 rss: 70Mb L: 80/117 MS: 1 InsertRepeatedBytes-
00:08:33.512 [2024-04-27 06:51:03.394341] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x5ff0000000000 prot=0x3: Invalid argument
00:08:33.512 [2024-04-27 06:51:03.394365] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x5ff0000000000 flags=0x3: Invalid argument
00:08:33.512 [2024-04-27 06:51:03.394375] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:33.512 [2024-04-27 06:51:03.394392] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:33.770 #13 NEW cov: 10758 ft: 17721 corp: 11/930b lim: 320 exec/s: 6 rss: 70Mb L: 89/117 MS: 1 EraseBytes-
00:08:33.770 #13 DONE cov: 10758 ft: 17721 corp: 11/930b lim: 320 exec/s: 6 rss: 70Mb
00:08:33.770 ###### Recommended dictionary. ######
00:08:33.770 "\377\005" # Uses: 1
00:08:33.770 ###### End of recommended dictionary. ######
00:08:33.770 Done 13 runs in 2 second(s)
00:08:34.029 06:51:03 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3
00:08:34.029 06:51:03 -- ../common.sh@72 -- # (( i++ ))
00:08:34.029 06:51:03 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:34.029 06:51:03 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1
00:08:34.029 06:51:03 -- vfio/run.sh@22 -- # local fuzzer_type=4
00:08:34.029 06:51:03 -- vfio/run.sh@23 -- # local timen=1
00:08:34.029 06:51:03 -- vfio/run.sh@24 -- # local core=0x1
00:08:34.029 06:51:03 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:08:34.029 06:51:03 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4
00:08:34.029 06:51:03 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1
00:08:34.029 06:51:03 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2
00:08:34.029 06:51:03 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf
00:08:34.029 06:51:03 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:08:34.029 06:51:03 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%;
00:08:34.029 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:34.029 06:51:03 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4
00:08:34.029 [2024-04-27 06:51:03.796104] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
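[Editor's note] The NEW_FUNC lines in the run above name the two functions this fuzzer drives: fuzz_vfio_user_dma_map (llvm_vfio_fuzz.c:125) and the shared TestOneInput entry point (llvm_vfio_fuzz.c:220). As a rough, hedged illustration only, a libFuzzer target of that general shape looks like the C sketch below; the struct layout, field names, and fuzz_dma_map handler are hypothetical stand-ins invented for this note, not the actual SPDK sources. It mimics the "[0, 0) ... Invalid argument" rejections in the log, where a zero-length DMA region is refused before mapping.

    /* Minimal sketch of a libFuzzer entry point; build with
     * clang -fsanitize=fuzzer sketch.c. All names below are
     * hypothetical, not the real llvm_vfio_fuzz.c code. */
    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    /* Hypothetical stand-in for a vfio-user DMA_MAP request payload. */
    struct dma_map_cmd {
        uint64_t addr;   /* start of the region to map             */
        uint64_t len;    /* region length; 0 is rejected as invalid */
        uint64_t offset; /* offset into the backing fd              */
        uint32_t flags;  /* e.g. 0x3 = read|write, as in the log    */
    };

    /* Hypothetical target: reject empty or overflowing regions,
     * the same class of input the log shows failing with EINVAL. */
    static int fuzz_dma_map(const struct dma_map_cmd *cmd)
    {
        if (cmd->len == 0 || cmd->addr + cmd->len < cmd->addr)
            return -1; /* invalid region, like "[0, 0)" above */
        return 0;
    }

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct dma_map_cmd cmd;

        if (size < sizeof(cmd))
            return 0; /* not enough bytes to form a command */

        memcpy(&cmd, data, sizeof(cmd));
        (void)fuzz_dma_map(&cmd);
        return 0; /* libFuzzer ignores the return value */
    }

With no seed corpus (the "0 files found" lines above), libFuzzer starts from empty inputs, which is why the early iterations are dominated by these Invalid argument rejections until mutation discovers larger, structurally valid commands.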
00:08:34.029 [2024-04-27 06:51:03.796195] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2631711 ]
00:08:34.029 EAL: No free 2048 kB hugepages reported on node 1
00:08:34.029 [2024-04-27 06:51:03.870224] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:34.029 [2024-04-27 06:51:03.907259] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:34.029 [2024-04-27 06:51:03.907409] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:34.287 INFO: Running with entropic power schedule (0xFF, 100).
00:08:34.287 INFO: Seed: 2538021394
00:08:34.287 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd),
00:08:34.287 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0),
00:08:34.287 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:08:34.287 INFO: A corpus is not provided, starting from an empty corpus
00:08:34.287 #2 INITED exec/s: 0 rss: 61Mb
00:08:34.287 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:34.287 This may also happen if the target rejected all inputs we tried so far
00:08:34.810 NEW_FUNC[1/618]: 0x49eea0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145
00:08:34.811 NEW_FUNC[2/618]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:34.811 #3 NEW cov: 10670 ft: 10587 corp: 2/116b lim: 320 exec/s: 0 rss: 66Mb L: 115/115 MS: 1 InsertRepeatedBytes-
00:08:35.074 NEW_FUNC[1/4]: 0x1c45b80 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1151
00:08:35.074 NEW_FUNC[2/4]: 0x1c46360 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1055
00:08:35.074 #9 NEW cov: 10710 ft: 13869 corp: 3/216b lim: 320 exec/s: 0 rss: 67Mb L: 100/115 MS: 1 EraseBytes-
00:08:35.332 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:35.332 #10 NEW cov: 10727 ft: 14360 corp: 4/316b lim: 320 exec/s: 0 rss: 68Mb L: 100/115 MS: 1 ChangeBit-
00:08:35.332 #11 NEW cov: 10727 ft: 15953 corp: 5/417b lim: 320 exec/s: 11 rss: 68Mb L: 101/115 MS: 1 InsertByte-
00:08:35.590 [2024-04-27 06:51:05.292855] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:08:35.590 [2024-04-27 06:51:05.292898] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:08:35.590 [2024-04-27 06:51:05.292910] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:35.590 [2024-04-27 06:51:05.292928] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:35.590 [2024-04-27 06:51:05.293875] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory
00:08:35.590 [2024-04-27 06:51:05.293894] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory
00:08:35.590 [2024-04-27 06:51:05.293911] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure
00:08:35.590 NEW_FUNC[1/6]: 0x1347dd0 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638
00:08:35.590 NEW_FUNC[2/6]: 0x1348060 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084
00:08:35.590 #18 NEW cov: 10759 ft: 16448 corp: 6/532b lim: 320 exec/s: 18 rss: 68Mb L: 115/115 MS: 2 ChangeBit-InsertRepeatedBytes-
00:08:35.847 #19 NEW cov: 10759 ft: 16599 corp: 7/632b lim: 320 exec/s: 19 rss: 68Mb L: 100/115 MS: 1 CrossOver-
00:08:35.847 [2024-04-27 06:51:05.703951] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:08:35.847 [2024-04-27 06:51:05.703981] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:08:35.847 [2024-04-27 06:51:05.703993] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:35.847 [2024-04-27 06:51:05.704009] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:35.847 [2024-04-27 06:51:05.704950] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory
00:08:35.847 [2024-04-27 06:51:05.704969] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory
00:08:35.847 [2024-04-27 06:51:05.704985] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure
00:08:36.105 #25 NEW cov: 10759 ft: 16879 corp: 8/747b lim: 320 exec/s: 25 rss: 68Mb L: 115/115 MS: 1 ChangeBinInt-
00:08:36.105 [2024-04-27 06:51:05.900345] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:08:36.105 [2024-04-27 06:51:05.900373] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:08:36.105 [2024-04-27 06:51:05.900383] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument
00:08:36.105 [2024-04-27 06:51:05.900406] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:08:36.105 [2024-04-27 06:51:05.901356] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory
00:08:36.105 [2024-04-27 06:51:05.901374] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory
00:08:36.105 [2024-04-27 06:51:05.901389] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure
00:08:36.363 #26 NEW cov: 10766 ft: 17172 corp: 9/863b lim: 320 exec/s: 26 rss: 69Mb L: 116/116 MS: 1 CrossOver-
00:08:36.363 #27 NEW cov: 10766 ft: 17277 corp: 10/964b lim: 320 exec/s: 13 rss: 69Mb L: 101/116 MS: 1 InsertByte-
00:08:36.363 #27 DONE cov: 10766 ft: 17277 corp: 10/964b lim: 320 exec/s: 13 rss: 69Mb
00:08:36.363 Done 27 runs in 2 second(s)
00:08:36.622 06:51:06 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4
00:08:36.622 06:51:06 -- ../common.sh@72 -- # (( i++ ))
00:08:36.622 06:51:06 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:36.622 06:51:06 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
00:08:36.622 06:51:06 -- vfio/run.sh@22 -- # local fuzzer_type=5
00:08:36.622 06:51:06 -- vfio/run.sh@23 -- # local timen=1
00:08:36.622 06:51:06 -- vfio/run.sh@24 -- # local core=0x1
00:08:36.622 06:51:06 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:36.622 06:51:06 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5
00:08:36.622 06:51:06 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1
00:08:36.622 06:51:06 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2
00:08:36.622 06:51:06 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf
00:08:36.622 06:51:06 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:36.622 06:51:06 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%;
00:08:36.622 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:36.622 06:51:06 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5
00:08:36.622 [2024-04-27 06:51:06.504194] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:08:36.622 [2024-04-27 06:51:06.504265] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2632686 ]
00:08:36.881 EAL: No free 2048 kB hugepages reported on node 1
00:08:36.881 [2024-04-27 06:51:06.576909] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:36.881 [2024-04-27 06:51:06.614062] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:36.881 [2024-04-27 06:51:06.614207] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:37.139 INFO: Running with entropic power schedule (0xFF, 100).
00:08:37.139 INFO: Seed: 953036794
00:08:37.139 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd),
00:08:37.139 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0),
00:08:37.139 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:37.139 INFO: A corpus is not provided, starting from an empty corpus
00:08:37.139 #2 INITED exec/s: 0 rss: 60Mb
00:08:37.139 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:37.139 This may also happen if the target rejected all inputs we tried so far
00:08:37.139 [2024-04-27 06:51:06.899454] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:37.139 [2024-04-27 06:51:06.899499] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:37.397 NEW_FUNC[1/575]: 0x49f8a0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172
00:08:37.397 NEW_FUNC[2/575]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:37.397 #8 NEW cov: 9632 ft: 10689 corp: 2/53b lim: 120 exec/s: 0 rss: 65Mb L: 52/52 MS: 1 InsertRepeatedBytes-
00:08:37.655 [2024-04-27 06:51:07.362188] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:37.655 [2024-04-27 06:51:07.362233] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:37.656 NEW_FUNC[1/53]: 0x13a3610 in sq_head_advance /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:585
00:08:37.656 NEW_FUNC[2/53]: 0x13a3a60 in consume_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2526
00:08:37.656 #9 NEW cov: 10738 ft: 14159 corp: 3/106b lim: 120 exec/s: 0 rss: 67Mb L: 53/53 MS: 1 InsertByte-
00:08:37.914 [2024-04-27 06:51:07.556601] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:37.914 [2024-04-27 06:51:07.556634] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:37.914 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:37.914 #10 NEW cov: 10755 ft: 14732 corp: 4/159b lim: 120 exec/s: 0 rss: 68Mb L: 53/53 MS: 1 CopyPart-
00:08:37.914 [2024-04-27 06:51:07.740310] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:37.914 [2024-04-27 06:51:07.740340] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:38.172 #11 NEW cov: 10755 ft: 15454 corp: 5/240b lim: 120 exec/s: 11 rss: 68Mb L: 81/81 MS: 1 InsertRepeatedBytes-
00:08:38.172 [2024-04-27 06:51:07.934374] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:38.172 [2024-04-27 06:51:07.934411] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:38.172 #12 NEW cov: 10755 ft: 15631 corp: 6/294b lim: 120 exec/s: 12 rss: 68Mb L: 54/81 MS: 1 InsertByte-
00:08:38.459 [2024-04-27 06:51:08.118211] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:38.459 [2024-04-27 06:51:08.118240] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:38.459 #13 NEW cov: 10755 ft: 16112 corp: 7/347b lim: 120 exec/s: 13 rss: 68Mb L: 53/81 MS: 1 ShuffleBytes-
00:08:38.459 [2024-04-27 06:51:08.303325] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:38.459 [2024-04-27 06:51:08.303354] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:38.719 #14 NEW cov: 10755 ft: 16167 corp: 8/436b lim: 120 exec/s: 14 rss: 68Mb L: 89/89 MS: 1 CrossOver-
00:08:38.719 [2024-04-27 06:51:08.485934] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:38.719 [2024-04-27 06:51:08.485965] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:38.978 #15 NEW cov: 10755 ft: 16768 corp: 9/548b lim: 120 exec/s: 15 rss: 68Mb L: 112/112 MS: 1 CrossOver-
00:08:38.978 [2024-04-27 06:51:08.670548] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:38.978 [2024-04-27 06:51:08.670578] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:39.240 #16 NEW cov: 10762 ft: 16981 corp: 10/602b lim: 120 exec/s: 16 rss: 68Mb L: 54/112 MS: 1 InsertByte-
00:08:38.978 [2024-04-27 06:51:08.854472] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:38.978 [2024-04-27 06:51:08.854501] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:39.240 #17 NEW cov: 10762 ft: 17052 corp: 11/715b lim: 120 exec/s: 8 rss: 68Mb L: 113/113 MS: 1 InsertByte-
00:08:39.240 #17 DONE cov: 10762 ft: 17052 corp: 11/715b lim: 120 exec/s: 8 rss: 68Mb
00:08:39.240 Done 17 runs in 2 second(s)
00:08:39.499 06:51:09 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5
00:08:39.499 06:51:09 -- ../common.sh@72 -- # (( i++ ))
00:08:39.499 06:51:09 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:39.499 06:51:09 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:08:39.499 06:51:09 -- vfio/run.sh@22 -- # local fuzzer_type=6
00:08:39.499 06:51:09 -- vfio/run.sh@23 -- # local timen=1
00:08:39.499 06:51:09 -- vfio/run.sh@24 -- # local core=0x1
00:08:39.499 06:51:09 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:39.499 06:51:09 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
00:08:39.499 06:51:09 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
00:08:39.499 06:51:09 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
00:08:39.499 06:51:09 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
00:08:39.499 06:51:09 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:39.499 06:51:09 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:08:39.499 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:39.499 06:51:09 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
00:08:39.499 [2024-04-27 06:51:09.267443] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
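[Editor's note] Fuzzer 5 above (fuzz_vfio_user_irq_set) and fuzzer 6 starting below (fuzz_vfio_user_set_msix) both hammer the vfio-user IRQ path, and nearly every input is rejected with "msg0: cmd 8 failed: Invalid argument". As a hedged sketch only, the kind of header validation that produces such rejections might look like the C fragment below. The field names follow the Linux VFIO_DEVICE_SET_IRQS layout, which the vfio-user protocol mirrors, but the function itself is an illustration invented for this note, not the SPDK implementation.

    /* Illustrative only: validate an IRQ-set request header of the
     * VFIO_DEVICE_SET_IRQS shape and reject malformed ones with
     * -EINVAL, the errno seen as "Invalid argument" in the log. */
    #include <stdint.h>
    #include <errno.h>

    struct irq_set_hdr {
        uint32_t argsz; /* total size of the message payload        */
        uint32_t flags; /* action (mask/unmask/trigger) | data type */
        uint32_t index; /* which IRQ: INTx, MSI, MSI-X, ...         */
        uint32_t start; /* first vector affected                    */
        uint32_t count; /* number of vectors                        */
    };

    static int check_irq_set(const struct irq_set_hdr *h,
                             uint32_t max_index, uint32_t max_vectors)
    {
        if (h->argsz < sizeof(*h))
            return -EINVAL;           /* header must fit in argsz   */
        if (h->index >= max_index)
            return -EINVAL;           /* unknown IRQ index          */
        if (h->start + h->count < h->start ||
            h->start + h->count > max_vectors)
            return -EINVAL;           /* vector range out of bounds */
        return 0;
    }

Random fuzz bytes almost never satisfy all three checks at once, which is consistent with the wall of cmd 8 rejections in these two runs.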
00:08:39.499 [2024-04-27 06:51:09.267540] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2633223 ]
00:08:39.499 EAL: No free 2048 kB hugepages reported on node 1
00:08:39.499 [2024-04-27 06:51:09.339448] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:39.499 [2024-04-27 06:51:09.376577] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:39.499 [2024-04-27 06:51:09.376736] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:39.757 INFO: Running with entropic power schedule (0xFF, 100).
00:08:39.757 INFO: Seed: 3713037428
00:08:39.757 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd),
00:08:39.757 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0),
00:08:39.757 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:39.757 INFO: A corpus is not provided, starting from an empty corpus
00:08:39.757 #2 INITED exec/s: 0 rss: 60Mb
00:08:39.757 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:39.757 This may also happen if the target rejected all inputs we tried so far
00:08:40.015 [2024-04-27 06:51:09.656433] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:40.015 [2024-04-27 06:51:09.656508] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:40.273 NEW_FUNC[1/628]: 0x4a0590 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:08:40.273 NEW_FUNC[2/628]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:40.273 #13 NEW cov: 10709 ft: 10468 corp: 2/16b lim: 90 exec/s: 0 rss: 66Mb L: 15/15 MS: 1 InsertRepeatedBytes-
00:08:40.273 [2024-04-27 06:51:10.136427] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:40.273 [2024-04-27 06:51:10.136477] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:40.531 #14 NEW cov: 10726 ft: 13478 corp: 3/31b lim: 90 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 ChangeBit-
00:08:40.531 [2024-04-27 06:51:10.314191] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:40.531 [2024-04-27 06:51:10.314228] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:40.531 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:40.531 #20 NEW cov: 10743 ft: 14893 corp: 4/54b lim: 90 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 CopyPart-
00:08:40.790 [2024-04-27 06:51:10.492697] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:40.790 [2024-04-27 06:51:10.492728] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:40.790 #21 NEW cov: 10743 ft: 15442 corp: 5/69b lim: 90 exec/s: 21 rss: 68Mb L: 15/23 MS: 1 ChangeBit-
00:08:40.790 [2024-04-27 06:51:10.669951] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:40.790 [2024-04-27 06:51:10.669980] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:41.048 #22 NEW cov: 10743 ft: 15993 corp: 6/84b lim: 90 exec/s: 22 rss: 68Mb L: 15/23 MS: 1 CMP- DE: "\377\377\377\377"-
00:08:41.048 [2024-04-27 06:51:10.848571] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:41.048 [2024-04-27 06:51:10.848602] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:41.308 #23 NEW cov: 10743 ft: 16152 corp: 7/100b lim: 90 exec/s: 23 rss: 68Mb L: 16/23 MS: 1 InsertByte-
00:08:41.308 [2024-04-27 06:51:11.024725] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:41.308 [2024-04-27 06:51:11.024754] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:41.308 #24 NEW cov: 10743 ft: 17407 corp: 8/112b lim: 90 exec/s: 24 rss: 68Mb L: 12/23 MS: 1 EraseBytes-
00:08:41.308 [2024-04-27 06:51:11.203134] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:41.308 [2024-04-27 06:51:11.203163] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:41.567 #25 NEW cov: 10743 ft: 17519 corp: 9/152b lim: 90 exec/s: 25 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes-
00:08:41.567 [2024-04-27 06:51:11.378447] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:41.567 [2024-04-27 06:51:11.378476] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:41.827 #26 NEW cov: 10750 ft: 17544 corp: 10/165b lim: 90 exec/s: 26 rss: 68Mb L: 13/40 MS: 1 InsertByte-
00:08:41.827 [2024-04-27 06:51:11.555119] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:41.827 [2024-04-27 06:51:11.555149] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:41.827 #27 NEW cov: 10750 ft: 17666 corp: 11/180b lim: 90 exec/s: 13 rss: 68Mb L: 15/40 MS: 1 ShuffleBytes-
00:08:41.827 #27 DONE cov: 10750 ft: 17666 corp: 11/180b lim: 90 exec/s: 13 rss: 68Mb
00:08:41.827 ###### Recommended dictionary. ######
00:08:41.827 "\377\377\377\377" # Uses: 0
00:08:41.827 ###### End of recommended dictionary. ######
00:08:41.827 Done 27 runs in 2 second(s)
00:08:42.087 06:51:11 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6
00:08:42.087 06:51:11 -- ../common.sh@72 -- # (( i++ ))
00:08:42.087 06:51:11 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:42.087 06:51:11 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT
00:08:42.087
00:08:42.087 real 0m19.212s
00:08:42.087 user 0m27.160s
00:08:42.087 sys 0m1.789s
00:08:42.087 06:51:11 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:42.087 06:51:11 -- common/autotest_common.sh@10 -- # set +x
00:08:42.087 ************************************
00:08:42.087 END TEST vfio_fuzz
00:08:42.087 ************************************
00:08:42.087 06:51:11 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]]
00:08:42.087
00:08:42.087 real 1m21.908s
00:08:42.087 user 2m6.150s
00:08:42.087 sys 0m9.151s
00:08:42.087 06:51:11 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:42.087 06:51:11 -- common/autotest_common.sh@10 -- # set +x
00:08:42.087 ************************************
00:08:42.087 END TEST llvm_fuzz
00:08:42.087 ************************************
00:08:42.346 06:51:11 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]]
00:08:42.346 06:51:11 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT
00:08:42.346 06:51:11 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup
00:08:42.346 06:51:11 -- common/autotest_common.sh@712 -- # xtrace_disable
00:08:42.346 06:51:11 -- common/autotest_common.sh@10 -- # set +x
00:08:42.346 06:51:12 -- spdk/autotest.sh@386 -- # autotest_cleanup
00:08:42.346 06:51:12 -- common/autotest_common.sh@1371 -- # local autotest_es=0
00:08:42.346 06:51:12 -- common/autotest_common.sh@1372 -- # xtrace_disable
00:08:42.346 06:51:12 -- common/autotest_common.sh@10 -- # set +x
00:08:48.919 INFO: APP EXITING
00:08:48.919 INFO: killing all VMs
00:08:48.919 INFO: killing vhost app
00:08:48.919 INFO: EXIT DONE
00:08:51.459 Waiting for block devices as requested
00:08:51.459 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:51.459 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:51.459 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:51.459 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:51.459 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:51.459 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:51.719 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:51.719 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:51.719 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:51.719 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:51.979 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:51.979 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:51.979 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:52.239 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:52.239 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:52.239 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:52.498 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:08:55.788 Cleaning
00:08:55.788 Removing: /dev/shm/spdk_tgt_trace.pid2596201
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2593762
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2594990
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2596201
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2596922
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2597308
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2597620
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2598271
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2598688
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2598870
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2599154
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2599464
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2600312
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2603266
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2603578
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2603863
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2604127
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2604699
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2604715
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2605289
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2605527
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2605844
00:08:55.788 Removing: /var/run/dpdk/spdk_pid2605873
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2606161
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2606266
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2606801
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2607081
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2607214
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2607440
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2607746
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2607771
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2607843
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2608096
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2608379
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2608645
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2608935
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2609099
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2609283
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2609510
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2609801
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2610069
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2610350
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2610586
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2610786
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2610937
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2611213
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2611487
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2611768
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2612034
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2612236
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2612382
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2612631
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2612897
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2613186
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2613452
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2613727
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2613874
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2614066
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2614315
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2614596
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2614870
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2615151
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2615357
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2615549
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2615737
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2616023
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2616292
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2616584
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2616852
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2617050
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2617203
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2617451
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2617721
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2617841
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2618557
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2618856
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2619390
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2619822
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2620223
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2620766
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2621060
00:08:56.048 Removing: /var/run/dpdk/spdk_pid2621603
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2622028
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2622428
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2622971
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2623292
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2623806
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2624284
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2624642
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2625179
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2625548
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2626011
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2626548
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2626847
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2627384
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2627767
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2628216
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2628753
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2629049
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2629661
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2630197
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2630715
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2631141
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2631711
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2632686
00:08:56.307 Removing: /var/run/dpdk/spdk_pid2633223
00:08:56.307 Clean
00:09:00.635 killing process with pid 2549267
00:09:00.635 killing process with pid 2549264
00:09:00.635 killing process with pid 2549266
00:09:00.635 killing process with pid 2549265
00:09:00.635 06:51:29 -- common/autotest_common.sh@1436 -- # return 0
00:09:00.635 06:51:29 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup
00:09:00.635 06:51:29 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:00.635 06:51:29 -- common/autotest_common.sh@10 -- # set +x
00:09:00.635 06:51:29 -- spdk/autotest.sh@389 -- # timing_exit autotest
00:09:00.635 06:51:29 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:00.635 06:51:29 -- common/autotest_common.sh@10 -- # set +x
00:09:00.635 06:51:29 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:00.635 06:51:29 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:00.635 06:51:29 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:00.635 06:51:29 -- spdk/autotest.sh@394 -- # hash lcov
00:09:00.635 06:51:29 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:09:00.635 06:51:30 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:00.635 06:51:30 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:00.635 06:51:30 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:00.635 06:51:30 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:00.635 06:51:30 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.635 06:51:30 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.635 06:51:30 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.635 06:51:30 -- paths/export.sh@5 -- $ export PATH
00:09:00.635 06:51:30 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.635 06:51:30 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:00.635 06:51:30 -- common/autobuild_common.sh@435 -- $ date +%s
00:09:00.635 06:51:30 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714193490.XXXXXX
00:09:00.635 06:51:30 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714193490.JV6EtQ
00:09:00.635 06:51:30 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:09:00.635 06:51:30 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']'
00:09:00.635 06:51:30 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:00.635 06:51:30 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:09:00.635 06:51:30 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:00.635 06:51:30 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:00.635 06:51:30 -- common/autobuild_common.sh@451 -- $ get_config_params
00:09:00.635 06:51:30 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:09:00.635 06:51:30 -- common/autotest_common.sh@10 -- $ set +x
00:09:00.636 06:51:30 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:09:00.636 06:51:30 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:00.636 06:51:30 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:00.636 06:51:30 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:00.636 06:51:30 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:00.636 06:51:30 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:00.636 06:51:30 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:00.636 06:51:30 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:00.636 06:51:30 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:00.636 06:51:30 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:00.636 06:51:30 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:00.636 + [[ -n 2493099 ]]
00:09:00.636 + sudo kill 2493099
00:09:00.646 [Pipeline] }
00:09:00.664 [Pipeline] // stage
00:09:00.671 [Pipeline] }
00:09:00.690 [Pipeline] // timeout
00:09:00.695 [Pipeline] }
00:09:00.712 [Pipeline] // catchError
00:09:00.718 [Pipeline] }
00:09:00.736 [Pipeline] // wrap
00:09:00.742 [Pipeline] }
00:09:00.759 [Pipeline] // catchError
00:09:00.768 [Pipeline] stage
00:09:00.770 [Pipeline] { (Epilogue)
00:09:00.785 [Pipeline] catchError
00:09:00.787 [Pipeline] {
00:09:00.802 [Pipeline] echo
00:09:00.804 Cleanup processes
00:09:00.810 [Pipeline] sh
00:09:01.096 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:01.096 2642047 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:01.111 [Pipeline] sh
00:09:01.396 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:01.396 ++ grep -v 'sudo pgrep'
00:09:01.396 ++ awk '{print $1}'
00:09:01.396 + sudo kill -9
00:09:01.396 + true
00:09:01.412 [Pipeline] sh
00:09:01.701 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:01.702 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:01.702 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:02.652 [Pipeline] sh
00:09:02.936 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:02.936 Artifacts sizes are good
00:09:02.952 [Pipeline] archiveArtifacts
00:09:02.959 Archiving artifacts
00:09:03.015 [Pipeline] sh
00:09:03.300 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:03.315 [Pipeline] cleanWs
00:09:03.325 [WS-CLEANUP] Deleting project workspace...
00:09:03.325 [WS-CLEANUP] Deferred wipeout is used...
00:09:03.330 [WS-CLEANUP] done
00:09:03.332 [Pipeline] }
00:09:03.352 [Pipeline] // catchError
00:09:03.364 [Pipeline] sh
00:09:03.644 + logger -p user.info -t JENKINS-CI
00:09:03.653 [Pipeline] }
00:09:03.672 [Pipeline] // stage
00:09:03.680 [Pipeline] }
00:09:03.697 [Pipeline] // node
00:09:03.703 [Pipeline] End of Pipeline
00:09:03.746 Finished: SUCCESS