00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 601 00:00:00.001 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3266 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.002 Started by timer 00:00:00.016 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.016 The recommended git tool is: git 00:00:00.017 using credential 00000000-0000-0000-0000-000000000002 00:00:00.018 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.029 Fetching changes from the remote Git repository 00:00:00.031 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.059 Using shallow fetch with depth 1 00:00:00.059 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.059 > git --version # timeout=10 00:00:00.078 > git --version # 'git version 2.39.2' 00:00:00.078 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.104 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.104 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.936 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.946 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.956 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD) 00:00:02.956 > git config core.sparsecheckout # timeout=10 00:00:02.965 > git read-tree -mu HEAD # timeout=10 00:00:02.979 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5 00:00:02.995 Commit message: "inventory: add WCP3 to free inventory" 00:00:02.995 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10 00:00:03.090 [Pipeline] Start of Pipeline 00:00:03.104 [Pipeline] library 00:00:03.106 Loading library shm_lib@master 00:00:03.106 Library shm_lib@master is cached. Copying from home. 00:00:03.125 [Pipeline] node 00:00:03.141 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.143 [Pipeline] { 00:00:03.153 [Pipeline] catchError 00:00:03.155 [Pipeline] { 00:00:03.171 [Pipeline] wrap 00:00:03.182 [Pipeline] { 00:00:03.189 [Pipeline] stage 00:00:03.192 [Pipeline] { (Prologue) 00:00:03.373 [Pipeline] sh 00:00:03.659 + logger -p user.info -t JENKINS-CI 00:00:03.674 [Pipeline] echo 00:00:03.675 Node: WFP20 00:00:03.683 [Pipeline] sh 00:00:03.979 [Pipeline] setCustomBuildProperty 00:00:03.990 [Pipeline] echo 00:00:03.992 Cleanup processes 00:00:03.996 [Pipeline] sh 00:00:04.273 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.274 540597 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.286 [Pipeline] sh 00:00:04.566 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.567 ++ grep -v 'sudo pgrep' 00:00:04.567 ++ awk '{print $1}' 00:00:04.567 + sudo kill -9 00:00:04.567 + true 00:00:04.579 [Pipeline] cleanWs 00:00:04.587 [WS-CLEANUP] Deleting project workspace... 00:00:04.587 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.592 [WS-CLEANUP] done 00:00:04.595 [Pipeline] setCustomBuildProperty 00:00:04.606 [Pipeline] sh 00:00:04.883 + sudo git config --global --replace-all safe.directory '*' 00:00:04.944 [Pipeline] httpRequest 00:00:04.970 [Pipeline] echo 00:00:04.971 Sorcerer 10.211.164.101 is alive 00:00:04.981 [Pipeline] httpRequest 00:00:04.985 HttpMethod: GET 00:00:04.985 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:04.985 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:05.013 Response Code: HTTP/1.1 200 OK 00:00:05.014 Success: Status code 200 is in the accepted range: 200,404 00:00:05.014 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:11.700 [Pipeline] sh 00:00:11.984 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:12.000 [Pipeline] httpRequest 00:00:12.023 [Pipeline] echo 00:00:12.024 Sorcerer 10.211.164.101 is alive 00:00:12.032 [Pipeline] httpRequest 00:00:12.036 HttpMethod: GET 00:00:12.037 URL: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:12.037 Sending request to url: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:12.061 Response Code: HTTP/1.1 200 OK 00:00:12.061 Success: Status code 200 is in the accepted range: 200,404 00:00:12.062 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:01:24.267 [Pipeline] sh 00:01:24.551 + tar --no-same-owner -xf spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:01:27.100 [Pipeline] sh 00:01:27.384 + git -C spdk log --oneline -n5 00:01:27.384 4b94202c6 lib/event: Bug fix for framework_set_scheduler 00:01:27.384 507e9ba07 nvme: add lock_depth for ctrlr_lock 00:01:27.384 62fda7b5f nvme: check pthread_mutex_destroy() return value 00:01:27.384 e03c164a1 nvme: add nvme_ctrlr_lock 00:01:27.384 d61f89a86 nvme/cuse: Add ctrlr_lock for cuse register and unregister 00:01:27.402 [Pipeline] withCredentials 00:01:27.413 > git --version # timeout=10 00:01:27.425 > git --version # 'git version 2.39.2' 00:01:27.442 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:27.445 [Pipeline] { 00:01:27.454 [Pipeline] retry 00:01:27.455 [Pipeline] { 00:01:27.472 [Pipeline] sh 00:01:27.755 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:27.766 [Pipeline] } 00:01:27.789 [Pipeline] // retry 00:01:27.794 [Pipeline] } 00:01:27.814 [Pipeline] // withCredentials 00:01:27.824 [Pipeline] httpRequest 00:01:27.842 [Pipeline] echo 00:01:27.844 Sorcerer 10.211.164.101 is alive 00:01:27.850 [Pipeline] httpRequest 00:01:27.855 HttpMethod: GET 00:01:27.855 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:27.856 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:27.858 Response Code: HTTP/1.1 200 OK 00:01:27.859 Success: Status code 200 is in the accepted range: 200,404 00:01:27.859 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:35.022 [Pipeline] sh 00:01:35.308 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:36.703 [Pipeline] sh 00:01:36.988 + git -C dpdk log --oneline -n5 00:01:36.988 caf0f5d395 version: 22.11.4 00:01:36.988 7d6f1cc05f 
Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:36.988 dc9c799c7d vhost: fix missing spinlock unlock 00:01:36.988 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:36.988 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:36.999 [Pipeline] } 00:01:37.015 [Pipeline] // stage 00:01:37.024 [Pipeline] stage 00:01:37.027 [Pipeline] { (Prepare) 00:01:37.051 [Pipeline] writeFile 00:01:37.070 [Pipeline] sh 00:01:37.355 + logger -p user.info -t JENKINS-CI 00:01:37.370 [Pipeline] sh 00:01:37.655 + logger -p user.info -t JENKINS-CI 00:01:37.669 [Pipeline] sh 00:01:37.953 + cat autorun-spdk.conf 00:01:37.953 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:37.953 SPDK_RUN_UBSAN=1 00:01:37.953 SPDK_TEST_FUZZER=1 00:01:37.953 SPDK_TEST_FUZZER_SHORT=1 00:01:37.953 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:37.953 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:37.961 RUN_NIGHTLY=1 00:01:37.966 [Pipeline] readFile 00:01:37.995 [Pipeline] withEnv 00:01:37.997 [Pipeline] { 00:01:38.011 [Pipeline] sh 00:01:38.296 + set -ex 00:01:38.296 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:38.296 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:38.296 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:38.296 ++ SPDK_RUN_UBSAN=1 00:01:38.296 ++ SPDK_TEST_FUZZER=1 00:01:38.296 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:38.296 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:38.296 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:38.296 ++ RUN_NIGHTLY=1 00:01:38.296 + case $SPDK_TEST_NVMF_NICS in 00:01:38.296 + DRIVERS= 00:01:38.296 + [[ -n '' ]] 00:01:38.296 + exit 0 00:01:38.305 [Pipeline] } 00:01:38.322 [Pipeline] // withEnv 00:01:38.327 [Pipeline] } 00:01:38.343 [Pipeline] // stage 00:01:38.352 [Pipeline] catchError 00:01:38.354 [Pipeline] { 00:01:38.369 [Pipeline] timeout 00:01:38.369 Timeout set to expire in 30 min 00:01:38.371 [Pipeline] { 00:01:38.385 [Pipeline] stage 00:01:38.387 [Pipeline] { (Tests) 00:01:38.402 [Pipeline] sh 00:01:38.726 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:38.726 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:38.726 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:38.726 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:38.726 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:38.726 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:38.726 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:38.726 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:38.726 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:38.726 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:38.726 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:38.726 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:38.726 + source /etc/os-release 00:01:38.726 ++ NAME='Fedora Linux' 00:01:38.726 ++ VERSION='38 (Cloud Edition)' 00:01:38.726 ++ ID=fedora 00:01:38.726 ++ VERSION_ID=38 00:01:38.726 ++ VERSION_CODENAME= 00:01:38.726 ++ PLATFORM_ID=platform:f38 00:01:38.726 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:38.726 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:38.726 ++ LOGO=fedora-logo-icon 00:01:38.726 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:38.726 ++ HOME_URL=https://fedoraproject.org/ 00:01:38.726 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:38.726 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:38.726 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:38.726 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:38.726 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:38.726 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:38.726 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:38.726 ++ SUPPORT_END=2024-05-14 00:01:38.726 ++ VARIANT='Cloud Edition' 00:01:38.726 ++ VARIANT_ID=cloud 00:01:38.726 + uname -a 00:01:38.726 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:38.726 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:41.265 Hugepages 00:01:41.265 node hugesize free / total 00:01:41.265 node0 1048576kB 0 / 0 00:01:41.265 node0 2048kB 0 / 0 00:01:41.265 node1 1048576kB 0 / 0 00:01:41.266 node1 2048kB 0 / 0 00:01:41.266 00:01:41.266 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:41.266 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:41.266 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:41.266 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:41.266 + rm -f /tmp/spdk-ld-path 00:01:41.266 + source autorun-spdk.conf 00:01:41.266 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:41.266 ++ SPDK_RUN_UBSAN=1 00:01:41.266 ++ SPDK_TEST_FUZZER=1 00:01:41.266 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:41.266 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:41.266 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.266 ++ RUN_NIGHTLY=1 00:01:41.266 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:41.266 + [[ -n '' ]] 00:01:41.266 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:41.525 + for M in /var/spdk/build-*-manifest.txt 00:01:41.525 + [[ -f 
/var/spdk/build-pkg-manifest.txt ]] 00:01:41.525 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:41.525 + for M in /var/spdk/build-*-manifest.txt 00:01:41.525 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:41.525 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:41.525 ++ uname 00:01:41.525 + [[ Linux == \L\i\n\u\x ]] 00:01:41.525 + sudo dmesg -T 00:01:41.525 + sudo dmesg --clear 00:01:41.525 + dmesg_pid=542086 00:01:41.525 + [[ Fedora Linux == FreeBSD ]] 00:01:41.525 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:41.525 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:41.525 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:41.525 + [[ -x /usr/src/fio-static/fio ]] 00:01:41.525 + sudo dmesg -Tw 00:01:41.525 + export FIO_BIN=/usr/src/fio-static/fio 00:01:41.525 + FIO_BIN=/usr/src/fio-static/fio 00:01:41.525 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:41.525 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:41.525 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:41.525 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:41.525 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:41.525 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:41.525 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:41.525 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:41.525 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:41.525 Test configuration: 00:01:41.525 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:41.525 SPDK_RUN_UBSAN=1 00:01:41.525 SPDK_TEST_FUZZER=1 00:01:41.526 SPDK_TEST_FUZZER_SHORT=1 00:01:41.526 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:41.526 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.526 RUN_NIGHTLY=1 02:49:36 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:41.526 02:49:36 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:41.526 02:49:36 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:41.526 02:49:36 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:41.526 02:49:36 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.526 02:49:36 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.526 02:49:36 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.526 02:49:36 -- paths/export.sh@5 -- $ export PATH 00:01:41.526 02:49:36 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.526 02:49:36 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:41.526 02:49:36 -- common/autobuild_common.sh@435 -- $ date +%s 00:01:41.526 02:49:36 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720918176.XXXXXX 00:01:41.526 02:49:36 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720918176.La3GBX 00:01:41.526 02:49:36 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:01:41.526 02:49:36 -- common/autobuild_common.sh@441 -- $ '[' -n v22.11.4 ']' 00:01:41.526 02:49:36 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.526 02:49:36 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:41.526 02:49:36 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:41.526 02:49:36 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:41.526 02:49:36 -- common/autobuild_common.sh@451 -- $ get_config_params 00:01:41.526 02:49:36 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:01:41.526 02:49:36 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.526 02:49:36 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:41.526 02:49:36 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:41.526 02:49:36 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:41.526 02:49:36 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:41.526 02:49:36 -- spdk/autobuild.sh@16 -- $ date -u 00:01:41.526 Sun Jul 14 12:49:36 AM UTC 2024 00:01:41.526 02:49:36 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:41.786 LTS-59-g4b94202c6 00:01:41.786 02:49:36 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:41.786 02:49:36 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:41.786 02:49:36 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:41.786 02:49:36 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:01:41.786 02:49:36 -- common/autotest_common.sh@1083 -- $ 
xtrace_disable 00:01:41.786 02:49:36 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.786 ************************************ 00:01:41.786 START TEST ubsan 00:01:41.786 ************************************ 00:01:41.786 02:49:36 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan' 00:01:41.786 using ubsan 00:01:41.786 00:01:41.786 real 0m0.000s 00:01:41.786 user 0m0.000s 00:01:41.786 sys 0m0.000s 00:01:41.786 02:49:36 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:41.786 02:49:36 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.786 ************************************ 00:01:41.786 END TEST ubsan 00:01:41.786 ************************************ 00:01:41.786 02:49:36 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:41.786 02:49:36 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:41.786 02:49:36 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:41.786 02:49:36 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:01:41.786 02:49:36 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:01:41.786 02:49:36 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.786 ************************************ 00:01:41.786 START TEST build_native_dpdk 00:01:41.786 ************************************ 00:01:41.786 02:49:36 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk 00:01:41.786 02:49:36 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:41.786 02:49:36 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:41.786 02:49:36 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:41.786 02:49:36 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:41.786 02:49:36 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:41.786 02:49:36 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:41.786 02:49:36 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:41.786 02:49:36 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:41.786 02:49:36 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:41.786 02:49:36 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:41.786 02:49:36 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:41.786 02:49:36 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:41.786 02:49:36 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.786 02:49:36 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.786 02:49:36 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:41.786 02:49:36 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:41.786 02:49:36 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:41.786 caf0f5d395 version: 22.11.4 00:01:41.786 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:41.786 dc9c799c7d vhost: fix missing spinlock unlock 00:01:41.786 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:41.786 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:41.786 02:49:36 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:41.786 02:49:36 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:41.786 02:49:36 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:41.786 02:49:36 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:41.786 02:49:36 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:41.786 02:49:36 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:41.786 02:49:36 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:41.786 02:49:36 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:41.786 02:49:36 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:41.786 02:49:36 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:41.786 02:49:36 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:41.786 02:49:36 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:41.786 02:49:36 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:41.786 02:49:36 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:41.786 02:49:36 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:41.786 02:49:36 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:41.786 02:49:36 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:41.786 02:49:36 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:41.786 02:49:36 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:41.786 02:49:36 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:41.786 02:49:36 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:41.786 02:49:36 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:41.786 02:49:36 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:41.786 02:49:36 -- scripts/common.sh@343 -- $ case "$op" in 00:01:41.786 02:49:36 -- scripts/common.sh@344 -- $ : 1 00:01:41.786 02:49:36 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:41.786 02:49:36 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:41.786 02:49:36 -- scripts/common.sh@364 -- $ decimal 22 00:01:41.786 02:49:36 -- scripts/common.sh@352 -- $ local d=22 00:01:41.786 02:49:36 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:41.786 02:49:36 -- scripts/common.sh@354 -- $ echo 22 00:01:41.786 02:49:36 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:41.786 02:49:36 -- scripts/common.sh@365 -- $ decimal 21 00:01:41.786 02:49:36 -- scripts/common.sh@352 -- $ local d=21 00:01:41.786 02:49:36 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:41.786 02:49:36 -- scripts/common.sh@354 -- $ echo 21 00:01:41.786 02:49:36 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:41.786 02:49:36 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:41.786 02:49:36 -- scripts/common.sh@366 -- $ return 1 00:01:41.786 02:49:36 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:41.786 patching file config/rte_config.h 00:01:41.786 Hunk #1 succeeded at 60 (offset 1 line). 00:01:41.786 02:49:36 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false 00:01:41.786 02:49:36 -- common/autobuild_common.sh@178 -- $ uname -s 00:01:41.786 02:49:36 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']' 00:01:41.786 02:49:36 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:41.786 02:49:36 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:47.066 The Meson build system 00:01:47.066 Version: 1.3.1 00:01:47.066 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:47.066 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:47.066 Build type: native build 00:01:47.066 Program cat found: YES (/usr/bin/cat) 00:01:47.066 Project name: DPDK 00:01:47.066 Project version: 22.11.4 00:01:47.067 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:47.067 C linker for the host machine: gcc ld.bfd 2.39-16 00:01:47.067 Host machine cpu family: x86_64 00:01:47.067 Host machine cpu: x86_64 00:01:47.067 Message: ## Building in Developer Mode ## 00:01:47.067 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:47.067 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:47.067 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:47.067 Program objdump found: YES (/usr/bin/objdump) 00:01:47.067 Program python3 found: YES (/usr/bin/python3) 00:01:47.067 Program cat found: YES (/usr/bin/cat) 00:01:47.067 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:01:47.067 Checking for size of "void *" : 8 00:01:47.067 Checking for size of "void *" : 8 (cached) 00:01:47.067 Library m found: YES 00:01:47.067 Library numa found: YES 00:01:47.067 Has header "numaif.h" : YES 00:01:47.067 Library fdt found: NO 00:01:47.067 Library execinfo found: NO 00:01:47.067 Has header "execinfo.h" : YES 00:01:47.067 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:47.067 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:47.067 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:47.067 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:47.067 Run-time dependency openssl found: YES 3.0.9 00:01:47.067 Run-time dependency libpcap found: YES 1.10.4 00:01:47.067 Has header "pcap.h" with dependency libpcap: YES 00:01:47.067 Compiler for C supports arguments -Wcast-qual: YES 00:01:47.067 Compiler for C supports arguments -Wdeprecated: YES 00:01:47.067 Compiler for C supports arguments -Wformat: YES 00:01:47.067 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:47.067 Compiler for C supports arguments -Wformat-security: NO 00:01:47.067 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:47.067 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:47.067 Compiler for C supports arguments -Wnested-externs: YES 00:01:47.067 Compiler for C supports arguments -Wold-style-definition: YES 00:01:47.067 Compiler for C supports arguments -Wpointer-arith: YES 00:01:47.067 Compiler for C supports arguments -Wsign-compare: YES 00:01:47.067 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:47.067 Compiler for C supports arguments -Wundef: YES 00:01:47.067 Compiler for C supports arguments -Wwrite-strings: YES 00:01:47.067 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:47.067 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:47.067 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:47.067 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:47.067 Compiler for C supports arguments -mavx512f: YES 00:01:47.067 Checking if "AVX512 checking" compiles: YES 00:01:47.067 Fetching value of define "__SSE4_2__" : 1 00:01:47.067 Fetching value of define "__AES__" : 1 00:01:47.067 Fetching value of define "__AVX__" : 1 00:01:47.067 Fetching value of define "__AVX2__" : 1 00:01:47.067 Fetching value of define "__AVX512BW__" : 1 00:01:47.067 Fetching value of define "__AVX512CD__" : 1 00:01:47.067 Fetching value of define "__AVX512DQ__" : 1 00:01:47.067 Fetching value of define "__AVX512F__" : 1 00:01:47.067 Fetching value of define "__AVX512VL__" : 1 00:01:47.067 Fetching value of define "__PCLMUL__" : 1 00:01:47.067 Fetching value of define "__RDRND__" : 1 00:01:47.067 Fetching value of define "__RDSEED__" : 1 00:01:47.067 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:47.067 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:47.067 Message: lib/kvargs: Defining dependency "kvargs" 00:01:47.067 Message: lib/telemetry: Defining dependency "telemetry" 00:01:47.067 Checking for function "getentropy" : YES 00:01:47.067 Message: lib/eal: Defining dependency "eal" 00:01:47.067 Message: lib/ring: Defining dependency "ring" 00:01:47.067 Message: lib/rcu: Defining dependency "rcu" 00:01:47.067 Message: lib/mempool: Defining dependency "mempool" 00:01:47.067 Message: lib/mbuf: Defining dependency "mbuf" 00:01:47.067 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:47.067 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:47.067 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:47.067 Compiler for C supports arguments -mpclmul: YES 00:01:47.067 Compiler for C supports arguments -maes: YES 00:01:47.067 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:47.067 Compiler for C supports arguments -mavx512bw: YES 00:01:47.067 Compiler for C supports arguments -mavx512dq: YES 00:01:47.067 Compiler for C supports arguments -mavx512vl: YES 00:01:47.067 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:47.067 Compiler for C supports arguments -mavx2: YES 00:01:47.067 Compiler for C supports arguments -mavx: YES 00:01:47.067 Message: lib/net: Defining dependency "net" 00:01:47.067 Message: lib/meter: Defining dependency "meter" 00:01:47.067 Message: lib/ethdev: Defining dependency "ethdev" 00:01:47.067 Message: lib/pci: Defining dependency "pci" 00:01:47.067 Message: lib/cmdline: Defining dependency "cmdline" 00:01:47.067 Message: lib/metrics: Defining dependency "metrics" 00:01:47.067 Message: lib/hash: Defining dependency "hash" 00:01:47.067 Message: lib/timer: Defining dependency "timer" 00:01:47.067 Fetching value of define "__AVX2__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.067 Message: lib/acl: Defining dependency "acl" 00:01:47.067 Message: lib/bbdev: Defining dependency "bbdev" 00:01:47.067 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:47.067 Run-time dependency libelf found: YES 0.190 00:01:47.067 Message: lib/bpf: Defining dependency "bpf" 00:01:47.067 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:47.067 Message: lib/compressdev: Defining dependency "compressdev" 00:01:47.067 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:47.067 Message: lib/distributor: Defining dependency "distributor" 00:01:47.067 Message: lib/efd: Defining dependency "efd" 00:01:47.067 Message: lib/eventdev: Defining dependency "eventdev" 00:01:47.067 Message: lib/gpudev: Defining dependency "gpudev" 00:01:47.067 Message: lib/gro: Defining dependency "gro" 00:01:47.067 Message: lib/gso: Defining dependency "gso" 00:01:47.067 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:47.067 Message: lib/jobstats: Defining dependency "jobstats" 00:01:47.067 Message: lib/latencystats: Defining dependency "latencystats" 00:01:47.067 Message: lib/lpm: Defining dependency "lpm" 00:01:47.067 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:47.067 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:47.067 Message: lib/member: Defining dependency "member" 00:01:47.067 Message: lib/pcapng: Defining dependency "pcapng" 00:01:47.067 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:47.067 Message: lib/power: Defining dependency "power" 00:01:47.067 Message: lib/rawdev: Defining dependency "rawdev" 00:01:47.067 Message: lib/regexdev: Defining dependency "regexdev" 00:01:47.067 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:47.067 Message: lib/rib: Defining dependency "rib" 00:01:47.067 Message: lib/reorder: Defining dependency "reorder" 00:01:47.067 Message: lib/sched: Defining dependency "sched" 00:01:47.067 Message: lib/security: Defining dependency "security" 00:01:47.067 Message: lib/stack: Defining dependency "stack" 00:01:47.067 Has header "linux/userfaultfd.h" : YES 00:01:47.067 Message: lib/vhost: Defining dependency "vhost" 00:01:47.067 Message: lib/ipsec: Defining dependency "ipsec" 00:01:47.067 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.067 Message: lib/fib: Defining dependency "fib" 00:01:47.067 Message: lib/port: Defining dependency "port" 00:01:47.067 Message: lib/pdump: Defining dependency "pdump" 00:01:47.067 Message: lib/table: Defining dependency "table" 00:01:47.067 Message: lib/pipeline: Defining dependency "pipeline" 00:01:47.067 Message: lib/graph: Defining dependency "graph" 00:01:47.067 Message: lib/node: Defining dependency "node" 00:01:47.067 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:47.067 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:47.067 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:47.067 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:47.067 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:47.067 Compiler for C supports arguments -Wno-unused-value: YES 00:01:47.067 Compiler for C supports arguments -Wno-format: YES 00:01:47.067 Compiler for C supports arguments -Wno-format-security: YES 00:01:47.067 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:47.067 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:47.067 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:47.067 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:47.067 Fetching value of define "__AVX2__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.067 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.067 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:47.067 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:47.067 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:47.067 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:47.067 Program doxygen found: YES (/usr/bin/doxygen) 00:01:47.067 Configuring doxy-api.conf using configuration 00:01:47.067 Program sphinx-build found: NO 00:01:47.067 Configuring rte_build_config.h using configuration 00:01:47.067 Message: 00:01:47.067 ================= 00:01:47.067 Applications Enabled 00:01:47.067 ================= 00:01:47.067 00:01:47.067 apps: 00:01:47.067 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:47.067 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:47.067 test-security-perf, 00:01:47.067 00:01:47.067 Message: 00:01:47.067 ================= 00:01:47.067 Libraries Enabled 00:01:47.067 ================= 00:01:47.067 00:01:47.067 libs: 00:01:47.067 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:47.067 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:47.067 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:47.067 eventdev, gpudev, 
gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:47.067 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:47.068 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:47.068 table, pipeline, graph, node, 00:01:47.068 00:01:47.068 Message: 00:01:47.068 =============== 00:01:47.068 Drivers Enabled 00:01:47.068 =============== 00:01:47.068 00:01:47.068 common: 00:01:47.068 00:01:47.068 bus: 00:01:47.068 pci, vdev, 00:01:47.068 mempool: 00:01:47.068 ring, 00:01:47.068 dma: 00:01:47.068 00:01:47.068 net: 00:01:47.068 i40e, 00:01:47.068 raw: 00:01:47.068 00:01:47.068 crypto: 00:01:47.068 00:01:47.068 compress: 00:01:47.068 00:01:47.068 regex: 00:01:47.068 00:01:47.068 vdpa: 00:01:47.068 00:01:47.068 event: 00:01:47.068 00:01:47.068 baseband: 00:01:47.068 00:01:47.068 gpu: 00:01:47.068 00:01:47.068 00:01:47.068 Message: 00:01:47.068 ================= 00:01:47.068 Content Skipped 00:01:47.068 ================= 00:01:47.068 00:01:47.068 apps: 00:01:47.068 00:01:47.068 libs: 00:01:47.068 kni: explicitly disabled via build config (deprecated lib) 00:01:47.068 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:47.068 00:01:47.068 drivers: 00:01:47.068 common/cpt: not in enabled drivers build config 00:01:47.068 common/dpaax: not in enabled drivers build config 00:01:47.068 common/iavf: not in enabled drivers build config 00:01:47.068 common/idpf: not in enabled drivers build config 00:01:47.068 common/mvep: not in enabled drivers build config 00:01:47.068 common/octeontx: not in enabled drivers build config 00:01:47.068 bus/auxiliary: not in enabled drivers build config 00:01:47.068 bus/dpaa: not in enabled drivers build config 00:01:47.068 bus/fslmc: not in enabled drivers build config 00:01:47.068 bus/ifpga: not in enabled drivers build config 00:01:47.068 bus/vmbus: not in enabled drivers build config 00:01:47.068 common/cnxk: not in enabled drivers build config 00:01:47.068 common/mlx5: not in enabled drivers build config 00:01:47.068 common/qat: not in enabled drivers build config 00:01:47.068 common/sfc_efx: not in enabled drivers build config 00:01:47.068 mempool/bucket: not in enabled drivers build config 00:01:47.068 mempool/cnxk: not in enabled drivers build config 00:01:47.068 mempool/dpaa: not in enabled drivers build config 00:01:47.068 mempool/dpaa2: not in enabled drivers build config 00:01:47.068 mempool/octeontx: not in enabled drivers build config 00:01:47.068 mempool/stack: not in enabled drivers build config 00:01:47.068 dma/cnxk: not in enabled drivers build config 00:01:47.068 dma/dpaa: not in enabled drivers build config 00:01:47.068 dma/dpaa2: not in enabled drivers build config 00:01:47.068 dma/hisilicon: not in enabled drivers build config 00:01:47.068 dma/idxd: not in enabled drivers build config 00:01:47.068 dma/ioat: not in enabled drivers build config 00:01:47.068 dma/skeleton: not in enabled drivers build config 00:01:47.068 net/af_packet: not in enabled drivers build config 00:01:47.068 net/af_xdp: not in enabled drivers build config 00:01:47.068 net/ark: not in enabled drivers build config 00:01:47.068 net/atlantic: not in enabled drivers build config 00:01:47.068 net/avp: not in enabled drivers build config 00:01:47.068 net/axgbe: not in enabled drivers build config 00:01:47.068 net/bnx2x: not in enabled drivers build config 00:01:47.068 net/bnxt: not in enabled drivers build config 00:01:47.068 net/bonding: not in enabled drivers build config 00:01:47.068 net/cnxk: not in enabled drivers build config 
00:01:47.068 net/cxgbe: not in enabled drivers build config 00:01:47.068 net/dpaa: not in enabled drivers build config 00:01:47.068 net/dpaa2: not in enabled drivers build config 00:01:47.068 net/e1000: not in enabled drivers build config 00:01:47.068 net/ena: not in enabled drivers build config 00:01:47.068 net/enetc: not in enabled drivers build config 00:01:47.068 net/enetfec: not in enabled drivers build config 00:01:47.068 net/enic: not in enabled drivers build config 00:01:47.068 net/failsafe: not in enabled drivers build config 00:01:47.068 net/fm10k: not in enabled drivers build config 00:01:47.068 net/gve: not in enabled drivers build config 00:01:47.068 net/hinic: not in enabled drivers build config 00:01:47.068 net/hns3: not in enabled drivers build config 00:01:47.068 net/iavf: not in enabled drivers build config 00:01:47.068 net/ice: not in enabled drivers build config 00:01:47.068 net/idpf: not in enabled drivers build config 00:01:47.068 net/igc: not in enabled drivers build config 00:01:47.068 net/ionic: not in enabled drivers build config 00:01:47.068 net/ipn3ke: not in enabled drivers build config 00:01:47.068 net/ixgbe: not in enabled drivers build config 00:01:47.068 net/kni: not in enabled drivers build config 00:01:47.068 net/liquidio: not in enabled drivers build config 00:01:47.068 net/mana: not in enabled drivers build config 00:01:47.068 net/memif: not in enabled drivers build config 00:01:47.068 net/mlx4: not in enabled drivers build config 00:01:47.068 net/mlx5: not in enabled drivers build config 00:01:47.068 net/mvneta: not in enabled drivers build config 00:01:47.068 net/mvpp2: not in enabled drivers build config 00:01:47.068 net/netvsc: not in enabled drivers build config 00:01:47.068 net/nfb: not in enabled drivers build config 00:01:47.068 net/nfp: not in enabled drivers build config 00:01:47.068 net/ngbe: not in enabled drivers build config 00:01:47.068 net/null: not in enabled drivers build config 00:01:47.068 net/octeontx: not in enabled drivers build config 00:01:47.068 net/octeon_ep: not in enabled drivers build config 00:01:47.068 net/pcap: not in enabled drivers build config 00:01:47.068 net/pfe: not in enabled drivers build config 00:01:47.068 net/qede: not in enabled drivers build config 00:01:47.068 net/ring: not in enabled drivers build config 00:01:47.068 net/sfc: not in enabled drivers build config 00:01:47.068 net/softnic: not in enabled drivers build config 00:01:47.068 net/tap: not in enabled drivers build config 00:01:47.068 net/thunderx: not in enabled drivers build config 00:01:47.068 net/txgbe: not in enabled drivers build config 00:01:47.068 net/vdev_netvsc: not in enabled drivers build config 00:01:47.068 net/vhost: not in enabled drivers build config 00:01:47.068 net/virtio: not in enabled drivers build config 00:01:47.068 net/vmxnet3: not in enabled drivers build config 00:01:47.068 raw/cnxk_bphy: not in enabled drivers build config 00:01:47.068 raw/cnxk_gpio: not in enabled drivers build config 00:01:47.068 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:47.068 raw/ifpga: not in enabled drivers build config 00:01:47.068 raw/ntb: not in enabled drivers build config 00:01:47.068 raw/skeleton: not in enabled drivers build config 00:01:47.068 crypto/armv8: not in enabled drivers build config 00:01:47.068 crypto/bcmfs: not in enabled drivers build config 00:01:47.068 crypto/caam_jr: not in enabled drivers build config 00:01:47.068 crypto/ccp: not in enabled drivers build config 00:01:47.068 crypto/cnxk: not in enabled drivers 
build config 00:01:47.068 crypto/dpaa_sec: not in enabled drivers build config 00:01:47.068 crypto/dpaa2_sec: not in enabled drivers build config 00:01:47.068 crypto/ipsec_mb: not in enabled drivers build config 00:01:47.068 crypto/mlx5: not in enabled drivers build config 00:01:47.068 crypto/mvsam: not in enabled drivers build config 00:01:47.068 crypto/nitrox: not in enabled drivers build config 00:01:47.068 crypto/null: not in enabled drivers build config 00:01:47.068 crypto/octeontx: not in enabled drivers build config 00:01:47.068 crypto/openssl: not in enabled drivers build config 00:01:47.068 crypto/scheduler: not in enabled drivers build config 00:01:47.068 crypto/uadk: not in enabled drivers build config 00:01:47.068 crypto/virtio: not in enabled drivers build config 00:01:47.068 compress/isal: not in enabled drivers build config 00:01:47.068 compress/mlx5: not in enabled drivers build config 00:01:47.068 compress/octeontx: not in enabled drivers build config 00:01:47.068 compress/zlib: not in enabled drivers build config 00:01:47.068 regex/mlx5: not in enabled drivers build config 00:01:47.068 regex/cn9k: not in enabled drivers build config 00:01:47.068 vdpa/ifc: not in enabled drivers build config 00:01:47.068 vdpa/mlx5: not in enabled drivers build config 00:01:47.068 vdpa/sfc: not in enabled drivers build config 00:01:47.068 event/cnxk: not in enabled drivers build config 00:01:47.068 event/dlb2: not in enabled drivers build config 00:01:47.068 event/dpaa: not in enabled drivers build config 00:01:47.068 event/dpaa2: not in enabled drivers build config 00:01:47.068 event/dsw: not in enabled drivers build config 00:01:47.068 event/opdl: not in enabled drivers build config 00:01:47.068 event/skeleton: not in enabled drivers build config 00:01:47.068 event/sw: not in enabled drivers build config 00:01:47.068 event/octeontx: not in enabled drivers build config 00:01:47.068 baseband/acc: not in enabled drivers build config 00:01:47.068 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:47.068 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:47.068 baseband/la12xx: not in enabled drivers build config 00:01:47.068 baseband/null: not in enabled drivers build config 00:01:47.068 baseband/turbo_sw: not in enabled drivers build config 00:01:47.068 gpu/cuda: not in enabled drivers build config 00:01:47.068 00:01:47.068 00:01:47.068 Build targets in project: 311 00:01:47.068 00:01:47.068 DPDK 22.11.4 00:01:47.068 00:01:47.068 User defined options 00:01:47.068 libdir : lib 00:01:47.068 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:47.068 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:47.068 c_link_args : 00:01:47.068 enable_docs : false 00:01:47.068 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:47.068 enable_kmods : false 00:01:47.068 machine : native 00:01:47.068 tests : false 00:01:47.068 00:01:47.068 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:47.068 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
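
The meson configure step above, together with the ninja invocation that follows, is the whole of the external DPDK build; the rest of this stage is option plumbing. A minimal standalone sketch of the equivalent commands, using the paths, flags, and driver list taken from this log (the `meson setup` spelling addresses the deprecation warning printed just above; the `-j"$(nproc)"` job count is a generic stand-in, where the CI below uses -j112):

  # Sketch of the DPDK build driven by _build_native_dpdk in this log.
  # DPDK_DIR mirrors the CI workspace path; substitute your own checkout.
  DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
  cd "$DPDK_DIR"
  meson setup build-tmp --prefix="$DPDK_DIR/build" --libdir lib \
      -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= \
      '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
      -Dmachine=native \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base
  ninja -C build-tmp -j"$(nproc)"
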
00:01:47.333 02:49:42 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:47.333 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:47.333 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:47.333 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:47.333 [3/740] Generating lib/rte_telemetry_def with a custom command 00:01:47.333 [4/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:47.333 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:47.333 [6/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:47.333 [7/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:47.333 [8/740] Generating lib/rte_rcu_def with a custom command 00:01:47.333 [9/740] Generating lib/rte_rcu_mingw with a custom command 00:01:47.333 [10/740] Generating lib/rte_mempool_def with a custom command 00:01:47.333 [11/740] Generating lib/rte_mbuf_def with a custom command 00:01:47.333 [12/740] Generating lib/rte_eal_def with a custom command 00:01:47.333 [13/740] Generating lib/rte_eal_mingw with a custom command 00:01:47.333 [14/740] Generating lib/rte_ring_def with a custom command 00:01:47.333 [15/740] Generating lib/rte_ring_mingw with a custom command 00:01:47.333 [16/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:47.333 [17/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:47.333 [18/740] Generating lib/rte_net_mingw with a custom command 00:01:47.333 [19/740] Generating lib/rte_meter_mingw with a custom command 00:01:47.333 [20/740] Generating lib/rte_mempool_mingw with a custom command 00:01:47.333 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:47.333 [22/740] Generating lib/rte_net_def with a custom command 00:01:47.333 [23/740] Generating lib/rte_meter_def with a custom command 00:01:47.596 [24/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:47.596 [25/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:47.596 [26/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:47.596 [27/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:47.596 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:47.596 [29/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:47.596 [30/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:47.596 [31/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:47.596 [32/740] Linking static target lib/librte_kvargs.a 00:01:47.596 [33/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:47.596 [34/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:47.596 [35/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:47.596 [36/740] Generating lib/rte_ethdev_def with a custom command 00:01:47.596 [37/740] Generating lib/rte_pci_def with a custom command 00:01:47.596 [38/740] Generating lib/rte_pci_mingw with a custom command 00:01:47.596 [39/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:47.596 [40/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:47.596 [41/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:47.596 [42/740] 
Generating lib/rte_cmdline_def with a custom command 00:01:47.596 [43/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:47.596 [44/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:47.596 [45/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:47.596 [46/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:47.596 [47/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:47.596 [48/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:47.596 [49/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:47.596 [50/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:47.596 [51/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:47.596 [52/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:47.596 [53/740] Generating lib/rte_metrics_def with a custom command 00:01:47.596 [54/740] Generating lib/rte_hash_def with a custom command 00:01:47.596 [55/740] Generating lib/rte_hash_mingw with a custom command 00:01:47.596 [56/740] Generating lib/rte_metrics_mingw with a custom command 00:01:47.596 [57/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:47.596 [58/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:47.596 [59/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:47.596 [60/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:47.596 [61/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:47.596 [62/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:47.596 [63/740] Generating lib/rte_timer_def with a custom command 00:01:47.596 [64/740] Generating lib/rte_timer_mingw with a custom command 00:01:47.596 [65/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:47.596 [66/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:47.596 [67/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:47.596 [68/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:47.596 [69/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:47.596 [70/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:47.596 [71/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:47.596 [72/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:47.596 [73/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:47.596 [74/740] Generating lib/rte_acl_def with a custom command 00:01:47.596 [75/740] Generating lib/rte_bitratestats_def with a custom command 00:01:47.596 [76/740] Generating lib/rte_acl_mingw with a custom command 00:01:47.596 [77/740] Generating lib/rte_bbdev_def with a custom command 00:01:47.596 [78/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:47.596 [79/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:47.596 [80/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:47.596 [81/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:47.596 [82/740] Linking static target lib/librte_pci.a 00:01:47.596 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:47.596 [84/740] Compiling C object 
lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:47.596 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:47.596 [86/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:47.596 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:47.596 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:47.596 [89/740] Generating lib/rte_bpf_def with a custom command 00:01:47.596 [90/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:47.596 [91/740] Linking static target lib/librte_meter.a 00:01:47.596 [92/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:47.596 [93/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:47.596 [94/740] Generating lib/rte_bpf_mingw with a custom command 00:01:47.596 [95/740] Generating lib/rte_cfgfile_def with a custom command 00:01:47.596 [96/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:47.596 [97/740] Generating lib/rte_compressdev_def with a custom command 00:01:47.596 [98/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:47.596 [99/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:47.596 [100/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:47.596 [101/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:47.596 [102/740] Linking static target lib/librte_ring.a 00:01:47.596 [103/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:47.596 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:47.861 [105/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:47.861 [106/740] Generating lib/rte_cryptodev_def with a custom command 00:01:47.861 [107/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:47.861 [108/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:47.861 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:47.861 [110/740] Generating lib/rte_distributor_def with a custom command 00:01:47.861 [111/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:47.861 [112/740] Generating lib/rte_efd_mingw with a custom command 00:01:47.861 [113/740] Generating lib/rte_efd_def with a custom command 00:01:47.861 [114/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:47.861 [115/740] Generating lib/rte_distributor_mingw with a custom command 00:01:47.861 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:47.861 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:47.861 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:47.861 [119/740] Generating lib/rte_eventdev_def with a custom command 00:01:47.861 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:47.861 [121/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:47.861 [122/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:47.861 [123/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:47.861 [124/740] Generating lib/rte_gpudev_def with a custom command 00:01:47.861 [125/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:47.861 [126/740] Generating 
lib/rte_gpudev_mingw with a custom command 00:01:47.861 [127/740] Generating lib/rte_gro_def with a custom command 00:01:47.861 [128/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:47.861 [129/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:47.861 [130/740] Generating lib/rte_gro_mingw with a custom command 00:01:47.861 [131/740] Generating lib/rte_gso_def with a custom command 00:01:47.861 [132/740] Generating lib/rte_gso_mingw with a custom command 00:01:47.861 [133/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:47.861 [134/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:47.861 [135/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.861 [136/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:48.128 [137/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.128 [138/740] Generating lib/rte_ip_frag_def with a custom command 00:01:48.128 [139/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:48.128 [140/740] Linking target lib/librte_kvargs.so.23.0 00:01:48.128 [141/740] Generating lib/rte_jobstats_def with a custom command 00:01:48.128 [142/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:48.128 [143/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:48.128 [144/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:48.128 [145/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:48.128 [146/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.128 [147/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:48.128 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:48.128 [149/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:48.128 [150/740] Generating lib/rte_latencystats_def with a custom command 00:01:48.128 [151/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:48.128 [152/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:48.128 [153/740] Linking static target lib/librte_cfgfile.a 00:01:48.128 [154/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:48.128 [155/740] Generating lib/rte_lpm_def with a custom command 00:01:48.128 [156/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:48.128 [157/740] Generating lib/rte_lpm_mingw with a custom command 00:01:48.128 [158/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.128 [159/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:48.128 [160/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:48.128 [161/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:48.128 [162/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:48.128 [163/740] Generating lib/rte_member_def with a custom command 00:01:48.128 [164/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:48.128 [165/740] Generating lib/rte_member_mingw with a custom command 00:01:48.128 [166/740] Generating lib/rte_pcapng_def with a custom command 00:01:48.128 [167/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 
00:01:48.128 [168/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:48.128 [169/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:48.128 [170/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:48.128 [171/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:48.128 [172/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:48.128 [173/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:48.128 [174/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:48.128 [175/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:48.128 [176/740] Linking static target lib/librte_jobstats.a 00:01:48.128 [177/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:48.128 [178/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:48.128 [179/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:48.128 [180/740] Linking static target lib/librte_cmdline.a 00:01:48.128 [181/740] Linking static target lib/librte_timer.a 00:01:48.128 [182/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:48.128 [183/740] Generating lib/rte_power_def with a custom command 00:01:48.128 [184/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:48.128 [185/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:48.128 [186/740] Generating lib/rte_power_mingw with a custom command 00:01:48.128 [187/740] Generating lib/rte_rawdev_def with a custom command 00:01:48.128 [188/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:48.128 [189/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:48.128 [190/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:48.128 [191/740] Linking static target lib/librte_telemetry.a 00:01:48.390 [192/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:48.390 [193/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:48.390 [194/740] Generating lib/rte_regexdev_def with a custom command 00:01:48.390 [195/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:48.390 [196/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:48.390 [197/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:48.390 [198/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:48.390 [199/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:48.390 [200/740] Generating lib/rte_dmadev_def with a custom command 00:01:48.390 [201/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:48.390 [202/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:48.390 [203/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:48.390 [204/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:48.390 [205/740] Linking static target lib/librte_metrics.a 00:01:48.390 [206/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:48.390 [207/740] Generating lib/rte_rib_def with a custom command 00:01:48.390 [208/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:48.390 [209/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:48.390 [210/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:48.390 [211/740] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:48.390 [212/740] Linking static target lib/librte_net.a 00:01:48.390 [213/740] Generating lib/rte_reorder_def with a custom command 00:01:48.390 [214/740] Generating lib/rte_rib_mingw with a custom command 00:01:48.390 [215/740] Generating lib/rte_reorder_mingw with a custom command 00:01:48.390 [216/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:48.390 [217/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:48.390 [218/740] Generating lib/rte_sched_def with a custom command 00:01:48.390 [219/740] Generating lib/rte_sched_mingw with a custom command 00:01:48.390 [220/740] Generating lib/rte_security_def with a custom command 00:01:48.390 [221/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:48.390 [222/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:48.390 [223/740] Generating lib/rte_security_mingw with a custom command 00:01:48.390 [224/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:48.390 [225/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:48.390 [226/740] Linking static target lib/librte_bitratestats.a 00:01:48.390 [227/740] Generating lib/rte_stack_mingw with a custom command 00:01:48.390 [228/740] Generating lib/rte_stack_def with a custom command 00:01:48.390 [229/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:48.390 [230/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:48.390 [231/740] Generating lib/rte_vhost_mingw with a custom command 00:01:48.390 [232/740] Generating lib/rte_vhost_def with a custom command 00:01:48.390 [233/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:48.390 [234/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:48.390 [235/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:48.390 [236/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:48.390 [237/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:48.391 [238/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:48.391 [239/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:48.391 [240/740] Generating lib/rte_ipsec_def with a custom command 00:01:48.391 [241/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:48.391 [242/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:48.391 [243/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:48.391 [244/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:48.391 [245/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:48.391 [246/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:48.391 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:48.391 [248/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:48.391 [249/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:48.391 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:48.391 [251/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:48.391 [252/740] Generating lib/rte_fib_def with a custom command 00:01:48.391 [253/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:48.391 [254/740] Compiling C 
object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:48.657 [255/740] Generating lib/rte_fib_mingw with a custom command 00:01:48.657 [256/740] Linking static target lib/librte_stack.a 00:01:48.657 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:48.657 [258/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:48.657 [259/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:48.657 [260/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:48.657 [261/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:48.657 [262/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:48.657 [263/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:48.657 [264/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:48.657 [265/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.657 [266/740] Generating lib/rte_pdump_def with a custom command 00:01:48.657 [267/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:48.657 [268/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:48.657 [269/740] Generating lib/rte_port_def with a custom command 00:01:48.657 [270/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:48.657 [271/740] Generating lib/rte_port_mingw with a custom command 00:01:48.657 [272/740] Generating lib/rte_pdump_mingw with a custom command 00:01:48.657 [273/740] Linking static target lib/librte_compressdev.a 00:01:48.657 [274/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:48.657 [275/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:48.657 [276/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.657 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:48.657 [278/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:48.657 [279/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.657 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:48.657 [281/740] Linking static target lib/librte_rcu.a 00:01:48.657 [282/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:48.657 [283/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:48.657 [284/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.657 [285/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:48.657 [286/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:48.657 [287/740] Linking static target lib/librte_rawdev.a 00:01:48.657 [288/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:48.657 [289/740] Linking static target lib/librte_mempool.a 00:01:48.657 [290/740] Generating lib/rte_table_mingw with a custom command 00:01:48.657 [291/740] Generating lib/rte_table_def with a custom command 00:01:48.657 [292/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:48.657 [293/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:48.657 [294/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:48.657 [295/740] Linking static target lib/librte_bbdev.a 00:01:48.657 [296/740] Linking static target 
lib/librte_gro.a 00:01:48.657 [297/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:48.657 [298/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.921 [299/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:48.921 [300/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:48.921 [301/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.921 [302/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:48.921 [303/740] Linking static target lib/librte_gpudev.a 00:01:48.921 [304/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:48.921 [305/740] Linking static target lib/librte_dmadev.a 00:01:48.921 [306/740] Generating lib/rte_pipeline_def with a custom command 00:01:48.921 [307/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:48.921 [308/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:48.921 [309/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:48.921 [310/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:48.921 [311/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:48.921 [312/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.921 [313/740] Linking target lib/librte_telemetry.so.23.0 00:01:48.921 [314/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:48.921 [315/740] Linking static target lib/librte_latencystats.a 00:01:48.921 [316/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:48.921 [317/740] Linking static target lib/librte_gso.a 00:01:48.921 [318/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:48.921 [319/740] Generating lib/rte_graph_mingw with a custom command 00:01:48.921 [320/740] Generating lib/rte_graph_def with a custom command 00:01:48.921 [321/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:48.921 [322/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:48.921 [323/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.921 [324/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:48.921 [325/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:48.921 [326/740] Linking static target lib/librte_distributor.a 00:01:48.921 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:48.921 [328/740] Linking static target lib/librte_ip_frag.a 00:01:48.921 [329/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:48.921 [330/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:48.921 [331/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:48.921 [332/740] Linking static target lib/librte_regexdev.a 00:01:49.184 [333/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:49.184 [334/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:49.184 [335/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:49.184 [336/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:49.184 [337/740] Compiling C object 
lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:49.184 [338/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:49.184 [339/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:49.184 [340/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:49.184 [341/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:49.184 [342/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:49.184 [343/740] Generating lib/rte_node_mingw with a custom command 00:01:49.184 [344/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.184 [345/740] Generating lib/rte_node_def with a custom command 00:01:49.184 [346/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:49.184 [347/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:49.184 [348/740] Linking static target lib/librte_eal.a 00:01:49.184 [349/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.184 [350/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:49.184 [351/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:49.184 [352/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:49.184 [353/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:49.184 [354/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.184 [355/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:49.184 [356/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.184 [357/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:49.184 [358/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:49.184 [359/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:49.184 [360/740] Linking static target lib/librte_power.a 00:01:49.184 [361/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:49.184 [362/740] Linking static target lib/librte_reorder.a 00:01:49.184 [363/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:49.184 [364/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:49.184 [365/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:49.184 [366/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:49.184 [367/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:49.184 [368/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:49.184 [369/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:49.184 [370/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:49.184 [371/740] Linking static target lib/librte_security.a 00:01:49.184 [372/740] Linking static target lib/librte_pcapng.a 00:01:49.184 [373/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:49.446 [374/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.446 [375/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:49.446 [376/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:49.446 [377/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:49.446 [378/740] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:49.446 [379/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.446 [380/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:49.446 [381/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:49.446 [382/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:49.446 [383/740] Linking static target lib/librte_mbuf.a 00:01:49.446 [384/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:49.446 [385/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:49.446 [386/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.446 [387/740] Linking static target lib/librte_bpf.a 00:01:49.446 [388/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:49.446 [389/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:49.446 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:49.446 [391/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:49.446 [392/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:49.446 [393/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:49.446 [394/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:49.446 [395/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:49.446 [396/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:49.446 [397/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:49.446 [398/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:49.446 [399/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:49.446 [400/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:49.446 [401/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:49.446 [402/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:49.446 [403/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:49.713 [404/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:49.713 [405/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:49.713 [406/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:49.713 [407/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:49.713 [408/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:49.713 [409/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.713 [410/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:49.713 [411/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:49.713 [412/740] Linking static target lib/librte_rib.a 00:01:49.713 [413/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:49.713 [414/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:49.713 [415/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.713 [416/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:49.713 [417/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:49.713 [418/740] Compiling C object 
lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:49.713 [419/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:49.713 [420/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:49.713 [421/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:49.713 [422/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:49.713 [423/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:49.713 [424/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.713 [425/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:49.713 [426/740] Linking static target lib/librte_lpm.a 00:01:49.713 [427/740] Linking static target lib/librte_graph.a 00:01:49.713 [428/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.713 [429/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:49.713 [430/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:49.713 [431/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:49.713 [432/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:49.713 [433/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:49.713 [434/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:49.713 [435/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:49.713 [436/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.713 [437/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:49.713 [438/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:49.713 [439/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:49.713 [440/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:49.713 [441/740] Linking static target lib/librte_efd.a 00:01:49.713 [442/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:49.975 [443/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:49.975 [444/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:49.975 [445/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:49.975 [446/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:49.975 [447/740] Linking static target drivers/librte_bus_vdev.a 00:01:49.975 [448/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:49.975 [449/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.975 [450/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:49.975 [451/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:49.975 [452/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:49.975 [453/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.975 [454/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.975 [455/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:49.975 [456/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:49.975 [457/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:49.975 
[458/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.975 [459/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:49.975 [460/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:49.975 [461/740] Linking static target lib/librte_fib.a 00:01:50.238 [462/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:50.238 [463/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [464/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:50.238 [465/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [466/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:50.238 [467/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [468/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:50.238 [469/740] Linking static target lib/librte_pdump.a 00:01:50.238 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:50.238 [471/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:50.238 [472/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:50.238 [473/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [474/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [475/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:50.238 [476/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [477/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:50.238 [478/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:50.238 [479/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:50.238 [480/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:50.238 [481/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:50.238 [482/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.238 [483/740] Linking static target drivers/librte_bus_pci.a 00:01:50.238 [484/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:50.502 [485/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:50.503 [486/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:50.503 [487/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:50.503 [488/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:50.503 [489/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:50.503 [490/740] Linking static target lib/librte_table.a 00:01:50.503 [491/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:50.503 [492/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:50.503 [493/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:50.503 [494/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:50.503 [495/740] Compiling C object 
app/dpdk-test-acl.p/test-acl_main.c.o 00:01:50.503 [496/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:50.503 [497/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:50.503 [498/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:50.503 [499/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:50.503 [500/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:50.503 [501/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:50.761 [502/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:50.761 [503/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:50.761 [504/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:50.761 [505/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.761 [506/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.761 [507/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:50.761 [508/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:50.761 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:50.761 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:50.761 [511/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:50.761 [512/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:50.761 [513/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:50.761 [514/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.761 [515/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:50.761 [516/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:50.761 [517/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:50.761 [518/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:50.761 [519/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:50.761 [520/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:50.761 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:50.761 [522/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:50.761 [523/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:50.761 [524/740] Linking static target lib/librte_cryptodev.a 00:01:50.761 [525/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:50.761 [526/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:50.761 [527/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:50.761 [528/740] Linking static target lib/librte_sched.a 00:01:50.761 [529/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.761 [530/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:50.761 [531/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 
00:01:50.761 [532/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:50.761 [533/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:50.761 [534/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:50.761 [535/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:50.761 [536/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:51.020 [537/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:51.020 [538/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:51.020 [539/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:51.020 [540/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:51.020 [541/740] Linking static target lib/librte_node.a 00:01:51.020 [542/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:51.020 [543/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.020 [544/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:51.020 [545/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:51.020 [546/740] Linking static target drivers/librte_mempool_ring.a 00:01:51.020 [547/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:51.020 [548/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:51.020 [549/740] Linking static target lib/librte_ipsec.a 00:01:51.020 [550/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:51.020 [551/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:51.020 [552/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:51.020 [553/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:51.020 [554/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:51.020 [555/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:51.020 [556/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:51.020 [557/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:51.020 [558/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:51.020 [559/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:51.020 [560/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:51.020 [561/740] Linking static target lib/librte_ethdev.a 00:01:51.020 [562/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:51.020 [563/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:51.278 [564/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:51.278 [565/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:51.278 [566/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:51.278 [567/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:51.278 [568/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:51.278 [569/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:51.278 [570/740] Compiling C object 
app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:51.278 [571/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.278 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:51.278 [573/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:51.278 [574/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:51.278 [575/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.278 [576/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:51.278 [577/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:51.278 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:51.278 [579/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:51.278 [580/740] Linking static target lib/librte_member.a 00:01:51.278 [581/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:51.278 [582/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:51.278 [583/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:51.278 [584/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:51.278 [585/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:51.278 [586/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:51.278 [587/740] Linking static target lib/librte_port.a 00:01:51.278 [588/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:51.537 [589/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:51.537 [590/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.537 [591/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:51.537 [592/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:51.537 [593/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.537 [594/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:51.537 [595/740] Linking static target lib/librte_eventdev.a 00:01:51.537 [596/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:51.537 [597/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:51.537 [598/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:51.537 [599/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:51.537 [600/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:51.537 [601/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:51.537 [602/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:51.537 [603/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:51.537 [604/740] Linking static target lib/librte_hash.a 00:01:51.795 [605/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:51.795 [606/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:51.795 [607/740] Linking static target lib/librte_acl.a 00:01:51.795 [608/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.795 [609/740] Compiling C object 
app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:51.795 [610/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:51.795 [611/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:52.052 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:52.052 [613/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:52.309 [614/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.309 [615/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.309 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:52.566 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:52.823 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.823 [619/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:53.080 [620/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:53.080 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:53.678 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:53.678 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:53.936 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:53.936 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:53.936 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:54.195 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:54.452 [628/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.452 [629/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:54.710 [630/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:54.710 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:54.710 [632/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.277 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.541 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.541 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:00.541 [636/740] Linking static target lib/librte_vhost.a 00:02:01.108 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:01.367 [638/740] Linking static target lib/librte_pipeline.a 00:02:01.626 [639/740] Linking target app/dpdk-test-fib 00:02:01.626 [640/740] Linking target app/dpdk-test-compress-perf 00:02:01.626 [641/740] Linking target app/dpdk-pdump 00:02:01.626 [642/740] Linking target app/dpdk-test-cmdline 00:02:01.626 [643/740] Linking target app/dpdk-dumpcap 00:02:01.626 [644/740] Linking target app/dpdk-test-gpudev 00:02:01.626 [645/740] Linking target app/dpdk-test-regex 00:02:01.626 [646/740] Linking target app/dpdk-proc-info 00:02:01.626 [647/740] Linking target app/dpdk-test-sad 00:02:01.626 [648/740] Linking target app/dpdk-test-acl 00:02:01.626 [649/740] Linking target app/dpdk-test-pipeline 00:02:01.626 [650/740] Linking target app/dpdk-test-security-perf 00:02:01.626 [651/740] Linking target app/dpdk-test-flow-perf 
00:02:01.626 [652/740] Linking target app/dpdk-test-crypto-perf 00:02:01.626 [653/740] Linking target app/dpdk-test-bbdev 00:02:01.626 [654/740] Linking target app/dpdk-test-eventdev 00:02:01.626 [655/740] Linking target app/dpdk-testpmd 00:02:02.562 [656/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.562 [657/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.562 [658/740] Linking target lib/librte_eal.so.23.0 00:02:02.562 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:02:02.821 [660/740] Linking target lib/librte_meter.so.23.0 00:02:02.821 [661/740] Linking target drivers/librte_bus_vdev.so.23.0 00:02:02.821 [662/740] Linking target lib/librte_pci.so.23.0 00:02:02.821 [663/740] Linking target lib/librte_ring.so.23.0 00:02:02.821 [664/740] Linking target lib/librte_cfgfile.so.23.0 00:02:02.821 [665/740] Linking target lib/librte_timer.so.23.0 00:02:02.821 [666/740] Linking target lib/librte_jobstats.so.23.0 00:02:02.821 [667/740] Linking target lib/librte_dmadev.so.23.0 00:02:02.821 [668/740] Linking target lib/librte_acl.so.23.0 00:02:02.821 [669/740] Linking target lib/librte_rawdev.so.23.0 00:02:02.821 [670/740] Linking target lib/librte_stack.so.23.0 00:02:02.821 [671/740] Linking target lib/librte_graph.so.23.0 00:02:02.821 [672/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:02:02.821 [673/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:02:02.821 [674/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:02:02.821 [675/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:02:02.821 [676/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:02:02.821 [677/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:02:02.821 [678/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:02:02.821 [679/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:02:02.821 [680/740] Linking target drivers/librte_bus_pci.so.23.0 00:02:02.821 [681/740] Linking target lib/librte_rcu.so.23.0 00:02:02.821 [682/740] Linking target lib/librte_mempool.so.23.0 00:02:03.080 [683/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:02:03.080 [684/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:02:03.080 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:02:03.080 [686/740] Linking target drivers/librte_mempool_ring.so.23.0 00:02:03.080 [687/740] Linking target lib/librte_rib.so.23.0 00:02:03.080 [688/740] Linking target lib/librte_mbuf.so.23.0 00:02:03.339 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:02:03.339 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:02:03.339 [691/740] Linking target lib/librte_fib.so.23.0 00:02:03.339 [692/740] Linking target lib/librte_compressdev.so.23.0 00:02:03.339 [693/740] Linking target lib/librte_bbdev.so.23.0 00:02:03.339 [694/740] Linking target lib/librte_reorder.so.23.0 00:02:03.339 [695/740] Linking target lib/librte_net.so.23.0 00:02:03.339 [696/740] Linking target lib/librte_cryptodev.so.23.0 00:02:03.339 [697/740] Linking target 
lib/librte_distributor.so.23.0 00:02:03.339 [698/740] Linking target lib/librte_gpudev.so.23.0 00:02:03.339 [699/740] Linking target lib/librte_regexdev.so.23.0 00:02:03.339 [700/740] Linking target lib/librte_sched.so.23.0 00:02:03.339 [701/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:02:03.339 [702/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:02:03.339 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:02:03.598 [704/740] Linking target lib/librte_cmdline.so.23.0 00:02:03.598 [705/740] Linking target lib/librte_security.so.23.0 00:02:03.598 [706/740] Linking target lib/librte_hash.so.23.0 00:02:03.598 [707/740] Linking target lib/librte_ethdev.so.23.0 00:02:03.598 [708/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:02:03.598 [709/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:02:03.598 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:02:03.598 [711/740] Linking target lib/librte_bpf.so.23.0 00:02:03.598 [712/740] Linking target lib/librte_pcapng.so.23.0 00:02:03.598 [713/740] Linking target lib/librte_efd.so.23.0 00:02:03.598 [714/740] Linking target lib/librte_ipsec.so.23.0 00:02:03.598 [715/740] Linking target lib/librte_lpm.so.23.0 00:02:03.598 [716/740] Linking target lib/librte_gro.so.23.0 00:02:03.598 [717/740] Linking target lib/librte_metrics.so.23.0 00:02:03.598 [718/740] Linking target lib/librte_member.so.23.0 00:02:03.598 [719/740] Linking target lib/librte_ip_frag.so.23.0 00:02:03.598 [720/740] Linking target lib/librte_gso.so.23.0 00:02:03.598 [721/740] Linking target lib/librte_power.so.23.0 00:02:03.598 [722/740] Linking target lib/librte_eventdev.so.23.0 00:02:03.598 [723/740] Linking target lib/librte_vhost.so.23.0 00:02:03.856 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:02:03.856 [725/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:02:03.856 [726/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:02:03.856 [727/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:02:03.856 [728/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:02:03.856 [729/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:02:03.856 [730/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:02:03.856 [731/740] Linking target lib/librte_node.so.23.0 00:02:03.856 [732/740] Linking target lib/librte_latencystats.so.23.0 00:02:03.856 [733/740] Linking target lib/librte_bitratestats.so.23.0 00:02:03.856 [734/740] Linking target lib/librte_pdump.so.23.0 00:02:03.856 [735/740] Linking target lib/librte_port.so.23.0 00:02:04.115 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:02:04.115 [737/740] Linking target lib/librte_table.so.23.0 00:02:04.115 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:02:06.016 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.016 [740/740] Linking target lib/librte_pipeline.so.23.0 00:02:06.016 02:50:01 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:06.016 ninja: 
Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:06.016 [0/1] Installing files. 00:02:06.277 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 
00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.279 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:06.279 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:06.279 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:06.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:06.281 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:06.281 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.281 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.282 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:06.283 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:06.283 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.283 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:06.545 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:06.545 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:06.545 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:06.545 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:06.545 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.545 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.546 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:06.547 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:06.548 
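The four usertools scripts staged into build/bin above are DPDK's stock host-setup helpers rather than build artifacts. A minimal sketch of how a test node is typically prepared with them; the PCI address and hugepage sizes below are hypothetical, not taken from this run:

  # show which kernel driver each NIC is currently bound to
  build/bin/dpdk-devbind.py --status
  # reserve and mount 2 MB hugepages (needs root)
  sudo build/bin/dpdk-hugepages.py --pagesize 2M --setup 2G
  # rebind a hypothetical NIC to vfio-pci for userspace polling
  sudo build/bin/dpdk-devbind.py --bind=vfio-pci 0000:3b:00.0
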
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:06.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:06.548 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:06.548 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:06.548 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:06.548 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:06.549 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:06.549 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:06.549 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:06.549 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:06.549 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:06.549 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:06.549 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:06.549 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:06.549 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:06.549 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:06.549 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:06.549 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:06.549 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:06.549 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:06.549 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:06.549 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:06.549 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:06.549 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:06.549 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:06.549 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:06.549 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:06.549 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:06.549 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:06.549 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:06.549 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:06.549 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:06.549 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:06.549 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:06.549 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:06.549 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:06.549 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:06.549 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:06.549 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:06.549 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:06.549 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:06.549 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:06.549 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:06.549 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:06.549 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:06.549 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:06.549 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:06.549 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:06.549 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:06.549 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:06.549 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:06.549 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:06.549 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:06.549 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:06.549 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:06.549 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:06.549 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:06.549 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:06.549 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:06.549 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:06.549 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:06.549 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:06.549 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:06.549 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:06.549 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:06.549 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:06.549 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:06.549 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:06.549 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:06.549 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:06.549 
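The symlink installs in this stretch follow the standard ELF soname convention: each librte_*.so.23.0 is the real DSO, the .so.23 link is the runtime name resolved by the dynamic loader, and the bare .so link is what the compile-time linker finds via -lrte_*. A minimal sketch of the same chain, using librte_hash as the example:

  # real object: librte_hash.so.23.0 (as installed above)
  ln -s librte_hash.so.23.0 librte_hash.so.23   # runtime link, matches the embedded SONAME
  ln -s librte_hash.so.23 librte_hash.so        # dev link used when linking with -lrte_hash
  readelf -d librte_hash.so.23.0 | grep SONAME  # prints the soname the loader will look up
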
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:06.549 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:06.549 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:06.549 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:06.549 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:06.549 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:06.549 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:06.549 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:06.549 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:06.549 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:06.549 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:06.549 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:06.549 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:06.549 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:06.549 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:06.549 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:06.549 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:06.549 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:06.549 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:06.550 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:06.550 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:06.550 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:06.550 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:06.550 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:06.550 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:06.550 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:06.550 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:06.550 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:06.550 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:06.550 Installing symlink pointing to 
librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:06.550 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:06.550 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:06.550 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:06.550 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:06.550 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:06.550 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:06.550 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:06.550 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:06.550 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:06.550 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:06.550 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:06.550 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:06.550 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:06.550 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:06.550 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:06.550 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:06.550 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:06.550 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:06.550 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:06.550 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:06.550 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:06.550 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:06.550 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:02:06.550 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:02:06.550 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:02:06.550 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:02:06.550 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:02:06.550 02:50:01 -- common/autobuild_common.sh@189 -- $ uname -s 00:02:06.550 02:50:01 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:06.550 02:50:01 -- common/autobuild_common.sh@200 -- $ cat 00:02:06.550 02:50:01 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:06.550 00:02:06.550 real 0m24.924s 00:02:06.550 user 6m31.654s 00:02:06.550 sys 2m11.172s 00:02:06.550 02:50:01 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:06.550 02:50:01 -- common/autotest_common.sh@10 -- $ set +x 00:02:06.550 ************************************ 00:02:06.550 END TEST build_native_dpdk 00:02:06.550 ************************************ 00:02:06.808 02:50:01 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:06.808 02:50:01 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:06.808 02:50:01 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:06.808 02:50:01 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:06.808 02:50:01 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:06.808 02:50:01 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:02:06.808 02:50:01 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:06.808 02:50:01 -- common/autotest_common.sh@10 -- $ set +x 00:02:06.808 ************************************ 00:02:06.808 START TEST autobuild_llvm_precompile 00:02:06.808 ************************************ 00:02:06.808 02:50:01 -- common/autotest_common.sh@1104 -- $ _llvm_precompile 00:02:06.808 02:50:01 -- common/autobuild_common.sh@32 -- $ clang --version 00:02:06.808 02:50:01 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38) 00:02:06.808 Target: x86_64-redhat-linux-gnu 00:02:06.808 Thread model: posix 00:02:06.808 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:06.808 02:50:01 -- common/autobuild_common.sh@33 -- $ clang_num=16 00:02:06.808 02:50:01 -- common/autobuild_common.sh@35 -- $ export CC=clang-16 00:02:06.808 02:50:01 -- common/autobuild_common.sh@35 -- $ CC=clang-16 00:02:06.808 02:50:01 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16 00:02:06.808 02:50:01 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16 00:02:06.808 02:50:01 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a) 00:02:06.808 02:50:01 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:06.808 02:50:01 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]] 00:02:06.808 02:50:01 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a' 00:02:06.808 02:50:01 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:07.066 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:07.066 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:07.066 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:07.324 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:07.583 Using 'verbs' RDMA provider 00:02:23.462 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:35.667 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:35.927 Creating mk/config.mk...done. 00:02:35.927 Creating mk/cc.flags.mk...done. 00:02:35.927 Type 'make' to build. 00:02:35.927 00:02:35.927 real 0m29.148s 00:02:35.927 user 0m12.464s 00:02:35.927 sys 0m16.049s 00:02:35.927 02:50:30 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:35.927 02:50:30 -- common/autotest_common.sh@10 -- $ set +x 00:02:35.927 ************************************ 00:02:35.927 END TEST autobuild_llvm_precompile 00:02:35.927 ************************************ 00:02:35.927 02:50:30 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:35.927 02:50:30 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:35.927 02:50:30 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:35.927 02:50:30 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:35.927 02:50:30 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:36.187 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:36.187 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.187 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.445 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:36.704 Using 'verbs' RDMA provider 00:02:49.870 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:59.852 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:59.852 Creating mk/config.mk...done. 
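The autobuild_llvm_precompile trace above shows how the fuzzer runtime is located: the script regex-matches the version X.Y.Z token in clang --version output, takes the major number as clang_num, and composes the libclang_rt path from it before handing everything to configure. A standalone sketch of the same pattern in plain bash (dots escaped here for strictness; otherwise as in the trace):

  ver_output=$(clang --version)
  if [[ $ver_output =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
      clang_num=${BASH_REMATCH[2]}   # major version; 16 on this node
      fuzzer_lib=/usr/lib64/clang/${clang_num}/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
      [[ -e $fuzzer_lib ]] && export CC=clang-${clang_num} CXX=clang++-${clang_num}
  fi
  # hand the staged DPDK and the fuzzer archive to SPDK's configure
  # (flag list abbreviated; the full set is in the trace above)
  ./configure --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-fuzzer="$fuzzer_lib"

The "Using .../dpdk/build/lib/pkgconfig for additional libs..." lines above confirm that configure resolves the DPDK libraries through the libdpdk.pc files installed earlier in this log.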
00:02:59.852 Creating mk/cc.flags.mk...done. 00:02:59.852 Type 'make' to build. 00:02:59.852 02:50:54 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:59.852 02:50:54 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:02:59.853 02:50:54 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:59.853 02:50:54 -- common/autotest_common.sh@10 -- $ set +x 00:02:59.853 ************************************ 00:02:59.853 START TEST make 00:02:59.853 ************************************ 00:02:59.853 02:50:54 -- common/autotest_common.sh@1104 -- $ make -j112 00:02:59.853 make[1]: Nothing to be done for 'all'. 00:03:01.759 The Meson build system 00:03:01.759 Version: 1.3.1 00:03:01.759 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:01.759 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:01.759 Build type: native build 00:03:01.759 Project name: libvfio-user 00:03:01.759 Project version: 0.0.1 00:03:01.759 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:03:01.759 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:03:01.759 Host machine cpu family: x86_64 00:03:01.759 Host machine cpu: x86_64 00:03:01.759 Run-time dependency threads found: YES 00:03:01.759 Library dl found: YES 00:03:01.759 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:03:01.759 Run-time dependency json-c found: YES 0.17 00:03:01.759 Run-time dependency cmocka found: YES 1.1.7 00:03:01.759 Program pytest-3 found: NO 00:03:01.759 Program flake8 found: NO 00:03:01.759 Program misspell-fixer found: NO 00:03:01.759 Program restructuredtext-lint found: NO 00:03:01.759 Program valgrind found: YES (/usr/bin/valgrind) 00:03:01.759 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:01.759 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:01.759 Compiler for C supports arguments -Wwrite-strings: YES 00:03:01.759 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:01.759 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:01.759 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:01.759 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
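The Meson output beginning above (its target summary continues below) is a plain native debug build of the bundled libvfio-user. A rough sketch of reproducing that configuration by hand, assuming the same options shown in the "User defined options" block; the staging path is hypothetical, while the install command mirrors the one this run issues a few lines further down:

  meson setup build-debug /path/to/libvfio-user \
      --buildtype=debug --default-library=static --libdir=/usr/local/lib
  ninja -C build-debug
  DESTDIR=/tmp/vfio-user-stage meson install --quiet -C build-debug
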
00:03:01.759 Build targets in project: 8 00:03:01.759 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:01.759 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:01.759 00:03:01.759 libvfio-user 0.0.1 00:03:01.759 00:03:01.759 User defined options 00:03:01.759 buildtype : debug 00:03:01.759 default_library: static 00:03:01.759 libdir : /usr/local/lib 00:03:01.759 00:03:01.759 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:01.759 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:02.017 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:02.017 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:02.017 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:02.017 [4/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:02.017 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:02.017 [6/36] Compiling C object samples/null.p/null.c.o 00:03:02.017 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:02.017 [8/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:02.017 [9/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:02.017 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:02.017 [11/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:02.017 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:02.017 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:02.017 [14/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:02.017 [15/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:02.017 [16/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:02.017 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:02.017 [18/36] Compiling C object samples/server.p/server.c.o 00:03:02.017 [19/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:02.017 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:02.017 [21/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:02.017 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:02.017 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:02.017 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:02.017 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:02.017 [26/36] Compiling C object samples/client.p/client.c.o 00:03:02.018 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:02.018 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:02.018 [29/36] Linking static target lib/libvfio-user.a 00:03:02.018 [30/36] Linking target samples/client 00:03:02.018 [31/36] Linking target samples/lspci 00:03:02.018 [32/36] Linking target samples/null 00:03:02.018 [33/36] Linking target samples/server 00:03:02.018 [34/36] Linking target samples/gpio-pci-idio-16 00:03:02.018 [35/36] Linking target test/unit_tests 00:03:02.018 [36/36] Linking target samples/shadow_ioeventfd_server 00:03:02.018 INFO: autodetecting backend as ninja 00:03:02.018 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:02.018 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:02.275 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:02.533 ninja: no work to do. 00:03:05.868 CC lib/ut_mock/mock.o 00:03:05.868 CC lib/ut/ut.o 00:03:05.868 CC lib/log/log.o 00:03:05.868 CC lib/log/log_flags.o 00:03:05.868 CC lib/log/log_deprecated.o 00:03:05.868 LIB libspdk_ut_mock.a 00:03:05.868 LIB libspdk_ut.a 00:03:05.868 LIB libspdk_log.a 00:03:06.126 CC lib/ioat/ioat.o 00:03:06.126 CC lib/dma/dma.o 00:03:06.126 CXX lib/trace_parser/trace.o 00:03:06.126 CC lib/util/base64.o 00:03:06.126 CC lib/util/bit_array.o 00:03:06.126 CC lib/util/cpuset.o 00:03:06.126 CC lib/util/crc16.o 00:03:06.126 CC lib/util/crc32_ieee.o 00:03:06.126 CC lib/util/crc32.o 00:03:06.126 CC lib/util/crc32c.o 00:03:06.126 CC lib/util/fd.o 00:03:06.126 CC lib/util/crc64.o 00:03:06.127 CC lib/util/dif.o 00:03:06.127 CC lib/util/file.o 00:03:06.127 CC lib/util/hexlify.o 00:03:06.127 CC lib/util/iov.o 00:03:06.127 CC lib/util/math.o 00:03:06.127 CC lib/util/pipe.o 00:03:06.127 CC lib/util/strerror_tls.o 00:03:06.127 CC lib/util/string.o 00:03:06.127 CC lib/util/uuid.o 00:03:06.127 CC lib/util/fd_group.o 00:03:06.127 CC lib/util/xor.o 00:03:06.127 CC lib/util/zipf.o 00:03:06.127 CC lib/vfio_user/host/vfio_user_pci.o 00:03:06.127 CC lib/vfio_user/host/vfio_user.o 00:03:06.127 LIB libspdk_dma.a 00:03:06.127 LIB libspdk_ioat.a 00:03:06.385 LIB libspdk_vfio_user.a 00:03:06.385 LIB libspdk_util.a 00:03:06.645 LIB libspdk_trace_parser.a 00:03:06.645 CC lib/idxd/idxd_user.o 00:03:06.645 CC lib/idxd/idxd.o 00:03:06.645 CC lib/idxd/idxd_kernel.o 00:03:06.645 CC lib/vmd/vmd.o 00:03:06.645 CC lib/vmd/led.o 00:03:06.645 CC lib/conf/conf.o 00:03:06.645 CC lib/rdma/common.o 00:03:06.645 CC lib/rdma/rdma_verbs.o 00:03:06.645 CC lib/env_dpdk/env.o 00:03:06.645 CC lib/env_dpdk/init.o 00:03:06.645 CC lib/env_dpdk/memory.o 00:03:06.645 CC lib/env_dpdk/pci.o 00:03:06.645 CC lib/env_dpdk/threads.o 00:03:06.645 CC lib/env_dpdk/pci_vmd.o 00:03:06.645 CC lib/env_dpdk/pci_ioat.o 00:03:06.645 CC lib/env_dpdk/pci_virtio.o 00:03:06.645 CC lib/env_dpdk/pci_idxd.o 00:03:06.645 CC lib/json/json_parse.o 00:03:06.645 CC lib/env_dpdk/pci_event.o 00:03:06.645 CC lib/env_dpdk/sigbus_handler.o 00:03:06.645 CC lib/json/json_util.o 00:03:06.904 CC lib/env_dpdk/pci_dpdk.o 00:03:06.904 CC lib/json/json_write.o 00:03:06.904 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:06.904 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:06.904 LIB libspdk_conf.a 00:03:06.904 LIB libspdk_rdma.a 00:03:06.904 LIB libspdk_json.a 00:03:07.164 LIB libspdk_idxd.a 00:03:07.164 LIB libspdk_vmd.a 00:03:07.164 CC lib/jsonrpc/jsonrpc_server.o 00:03:07.164 CC lib/jsonrpc/jsonrpc_client.o 00:03:07.164 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:07.164 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:07.423 LIB libspdk_jsonrpc.a 00:03:07.683 LIB libspdk_env_dpdk.a 00:03:07.683 CC lib/rpc/rpc.o 00:03:07.943 LIB libspdk_rpc.a 00:03:08.201 CC lib/sock/sock.o 00:03:08.201 CC lib/sock/sock_rpc.o 00:03:08.201 CC lib/trace/trace.o 00:03:08.201 CC lib/trace/trace_flags.o 00:03:08.201 CC lib/trace/trace_rpc.o 00:03:08.201 CC lib/notify/notify.o 00:03:08.201 CC lib/notify/notify_rpc.o 00:03:08.201 LIB libspdk_notify.a 00:03:08.460 LIB libspdk_trace.a 00:03:08.460 LIB libspdk_sock.a 00:03:08.719 CC lib/thread/thread.o 00:03:08.719 CC lib/thread/iobuf.o 00:03:08.719 CC lib/nvme/nvme_ctrlr.o 00:03:08.719 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:08.719 CC 
lib/nvme/nvme_fabric.o 00:03:08.719 CC lib/nvme/nvme_ns_cmd.o 00:03:08.719 CC lib/nvme/nvme_ns.o 00:03:08.719 CC lib/nvme/nvme_pcie_common.o 00:03:08.719 CC lib/nvme/nvme_pcie.o 00:03:08.719 CC lib/nvme/nvme_qpair.o 00:03:08.719 CC lib/nvme/nvme_discovery.o 00:03:08.719 CC lib/nvme/nvme.o 00:03:08.719 CC lib/nvme/nvme_quirks.o 00:03:08.719 CC lib/nvme/nvme_transport.o 00:03:08.719 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:08.719 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:08.719 CC lib/nvme/nvme_tcp.o 00:03:08.719 CC lib/nvme/nvme_opal.o 00:03:08.719 CC lib/nvme/nvme_io_msg.o 00:03:08.719 CC lib/nvme/nvme_poll_group.o 00:03:08.719 CC lib/nvme/nvme_zns.o 00:03:08.719 CC lib/nvme/nvme_cuse.o 00:03:08.719 CC lib/nvme/nvme_rdma.o 00:03:08.719 CC lib/nvme/nvme_vfio_user.o 00:03:09.654 LIB libspdk_thread.a 00:03:09.654 CC lib/virtio/virtio.o 00:03:09.654 CC lib/virtio/virtio_vhost_user.o 00:03:09.654 CC lib/virtio/virtio_pci.o 00:03:09.654 CC lib/virtio/virtio_vfio_user.o 00:03:09.654 CC lib/blob/blobstore.o 00:03:09.654 CC lib/blob/request.o 00:03:09.654 CC lib/blob/zeroes.o 00:03:09.654 CC lib/blob/blob_bs_dev.o 00:03:09.654 CC lib/accel/accel_sw.o 00:03:09.654 CC lib/accel/accel_rpc.o 00:03:09.654 CC lib/accel/accel.o 00:03:09.654 CC lib/vfu_tgt/tgt_endpoint.o 00:03:09.654 CC lib/vfu_tgt/tgt_rpc.o 00:03:09.654 CC lib/init/subsystem_rpc.o 00:03:09.654 CC lib/init/json_config.o 00:03:09.911 CC lib/init/subsystem.o 00:03:09.911 CC lib/init/rpc.o 00:03:09.911 LIB libspdk_nvme.a 00:03:09.911 LIB libspdk_virtio.a 00:03:09.911 LIB libspdk_init.a 00:03:09.911 LIB libspdk_vfu_tgt.a 00:03:10.169 CC lib/event/app.o 00:03:10.169 CC lib/event/reactor.o 00:03:10.169 CC lib/event/log_rpc.o 00:03:10.169 CC lib/event/app_rpc.o 00:03:10.169 CC lib/event/scheduler_static.o 00:03:10.428 LIB libspdk_accel.a 00:03:10.428 LIB libspdk_event.a 00:03:10.687 CC lib/bdev/bdev.o 00:03:10.687 CC lib/bdev/bdev_rpc.o 00:03:10.687 CC lib/bdev/bdev_zone.o 00:03:10.687 CC lib/bdev/scsi_nvme.o 00:03:10.687 CC lib/bdev/part.o 00:03:11.253 LIB libspdk_blob.a 00:03:11.511 CC lib/blobfs/blobfs.o 00:03:11.511 CC lib/blobfs/tree.o 00:03:11.770 CC lib/lvol/lvol.o 00:03:12.096 LIB libspdk_blobfs.a 00:03:12.096 LIB libspdk_lvol.a 00:03:12.355 LIB libspdk_bdev.a 00:03:12.615 CC lib/ublk/ublk.o 00:03:12.615 CC lib/ublk/ublk_rpc.o 00:03:12.615 CC lib/nbd/nbd.o 00:03:12.615 CC lib/nbd/nbd_rpc.o 00:03:12.615 CC lib/nvmf/ctrlr_discovery.o 00:03:12.615 CC lib/nvmf/ctrlr.o 00:03:12.615 CC lib/nvmf/ctrlr_bdev.o 00:03:12.615 CC lib/nvmf/subsystem.o 00:03:12.615 CC lib/nvmf/nvmf.o 00:03:12.615 CC lib/nvmf/nvmf_rpc.o 00:03:12.615 CC lib/ftl/ftl_core.o 00:03:12.615 CC lib/nvmf/transport.o 00:03:12.615 CC lib/ftl/ftl_init.o 00:03:12.615 CC lib/nvmf/tcp.o 00:03:12.615 CC lib/ftl/ftl_layout.o 00:03:12.615 CC lib/nvmf/vfio_user.o 00:03:12.615 CC lib/ftl/ftl_debug.o 00:03:12.615 CC lib/nvmf/rdma.o 00:03:12.615 CC lib/ftl/ftl_io.o 00:03:12.615 CC lib/ftl/ftl_sb.o 00:03:12.615 CC lib/scsi/dev.o 00:03:12.615 CC lib/scsi/port.o 00:03:12.615 CC lib/ftl/ftl_l2p.o 00:03:12.615 CC lib/scsi/scsi.o 00:03:12.615 CC lib/scsi/lun.o 00:03:12.615 CC lib/ftl/ftl_l2p_flat.o 00:03:12.615 CC lib/ftl/ftl_band.o 00:03:12.615 CC lib/ftl/ftl_nv_cache.o 00:03:12.615 CC lib/scsi/scsi_bdev.o 00:03:12.615 CC lib/scsi/task.o 00:03:12.615 CC lib/scsi/scsi_pr.o 00:03:12.615 CC lib/ftl/ftl_band_ops.o 00:03:12.615 CC lib/scsi/scsi_rpc.o 00:03:12.874 CC lib/ftl/ftl_writer.o 00:03:12.874 CC lib/ftl/ftl_rq.o 00:03:12.874 CC lib/ftl/ftl_reloc.o 00:03:12.874 CC lib/ftl/ftl_l2p_cache.o 00:03:12.874 
CC lib/ftl/ftl_p2l.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:12.874 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:12.874 CC lib/ftl/utils/ftl_conf.o 00:03:12.874 CC lib/ftl/utils/ftl_md.o 00:03:12.874 CC lib/ftl/utils/ftl_mempool.o 00:03:12.874 CC lib/ftl/utils/ftl_bitmap.o 00:03:12.874 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:12.874 CC lib/ftl/utils/ftl_property.o 00:03:12.874 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:12.874 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:12.874 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:12.874 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:12.874 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:12.874 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:12.874 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:12.874 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:12.874 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:12.874 CC lib/ftl/base/ftl_base_dev.o 00:03:12.874 CC lib/ftl/base/ftl_base_bdev.o 00:03:12.874 CC lib/ftl/ftl_trace.o 00:03:13.133 LIB libspdk_nbd.a 00:03:13.133 LIB libspdk_scsi.a 00:03:13.133 LIB libspdk_ublk.a 00:03:13.392 LIB libspdk_ftl.a 00:03:13.392 CC lib/iscsi/iscsi.o 00:03:13.392 CC lib/iscsi/conn.o 00:03:13.392 CC lib/iscsi/init_grp.o 00:03:13.392 CC lib/iscsi/portal_grp.o 00:03:13.392 CC lib/iscsi/md5.o 00:03:13.392 CC lib/iscsi/param.o 00:03:13.392 CC lib/iscsi/tgt_node.o 00:03:13.392 CC lib/iscsi/iscsi_subsystem.o 00:03:13.392 CC lib/vhost/vhost.o 00:03:13.392 CC lib/iscsi/iscsi_rpc.o 00:03:13.392 CC lib/iscsi/task.o 00:03:13.392 CC lib/vhost/vhost_rpc.o 00:03:13.392 CC lib/vhost/vhost_scsi.o 00:03:13.392 CC lib/vhost/vhost_blk.o 00:03:13.392 CC lib/vhost/rte_vhost_user.o 00:03:13.959 LIB libspdk_nvmf.a 00:03:13.959 LIB libspdk_vhost.a 00:03:14.218 LIB libspdk_iscsi.a 00:03:14.476 CC module/vfu_device/vfu_virtio.o 00:03:14.476 CC module/vfu_device/vfu_virtio_blk.o 00:03:14.476 CC module/vfu_device/vfu_virtio_scsi.o 00:03:14.735 CC module/vfu_device/vfu_virtio_rpc.o 00:03:14.735 CC module/env_dpdk/env_dpdk_rpc.o 00:03:14.735 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:14.735 LIB libspdk_env_dpdk_rpc.a 00:03:14.735 CC module/blob/bdev/blob_bdev.o 00:03:14.735 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:14.735 CC module/sock/posix/posix.o 00:03:14.735 CC module/scheduler/gscheduler/gscheduler.o 00:03:14.735 CC module/accel/dsa/accel_dsa.o 00:03:14.735 CC module/accel/dsa/accel_dsa_rpc.o 00:03:14.735 CC module/accel/iaa/accel_iaa.o 00:03:14.735 CC module/accel/iaa/accel_iaa_rpc.o 00:03:14.735 CC module/accel/error/accel_error.o 00:03:14.735 CC module/accel/error/accel_error_rpc.o 00:03:14.735 CC module/accel/ioat/accel_ioat.o 00:03:14.735 CC module/accel/ioat/accel_ioat_rpc.o 00:03:14.735 LIB libspdk_scheduler_dpdk_governor.a 00:03:14.735 LIB libspdk_scheduler_dynamic.a 00:03:14.735 LIB libspdk_scheduler_gscheduler.a 00:03:14.994 LIB libspdk_accel_error.a 00:03:14.994 LIB libspdk_accel_ioat.a 00:03:14.994 LIB libspdk_accel_iaa.a 00:03:14.994 LIB libspdk_accel_dsa.a 00:03:14.994 LIB libspdk_blob_bdev.a 00:03:14.994 LIB libspdk_vfu_device.a 00:03:15.253 LIB 
libspdk_sock_posix.a 00:03:15.253 CC module/bdev/null/bdev_null.o 00:03:15.253 CC module/bdev/lvol/vbdev_lvol.o 00:03:15.253 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:15.253 CC module/bdev/gpt/gpt.o 00:03:15.253 CC module/bdev/null/bdev_null_rpc.o 00:03:15.253 CC module/bdev/gpt/vbdev_gpt.o 00:03:15.253 CC module/bdev/raid/bdev_raid_rpc.o 00:03:15.253 CC module/bdev/raid/bdev_raid_sb.o 00:03:15.253 CC module/bdev/raid/bdev_raid.o 00:03:15.253 CC module/bdev/raid/raid0.o 00:03:15.253 CC module/bdev/raid/raid1.o 00:03:15.253 CC module/bdev/raid/concat.o 00:03:15.253 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:15.253 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:15.253 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:15.253 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:15.253 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:15.253 CC module/bdev/error/vbdev_error.o 00:03:15.253 CC module/bdev/split/vbdev_split.o 00:03:15.253 CC module/bdev/split/vbdev_split_rpc.o 00:03:15.253 CC module/bdev/error/vbdev_error_rpc.o 00:03:15.253 CC module/bdev/delay/vbdev_delay.o 00:03:15.253 CC module/bdev/malloc/bdev_malloc.o 00:03:15.253 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:15.253 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:15.253 CC module/bdev/ftl/bdev_ftl.o 00:03:15.253 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:15.253 CC module/bdev/nvme/bdev_nvme.o 00:03:15.253 CC module/bdev/aio/bdev_aio.o 00:03:15.253 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:15.253 CC module/bdev/nvme/nvme_rpc.o 00:03:15.253 CC module/bdev/aio/bdev_aio_rpc.o 00:03:15.253 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:15.253 CC module/bdev/nvme/bdev_mdns_client.o 00:03:15.253 CC module/bdev/passthru/vbdev_passthru.o 00:03:15.253 CC module/bdev/nvme/vbdev_opal.o 00:03:15.253 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:15.253 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:15.253 CC module/blobfs/bdev/blobfs_bdev.o 00:03:15.253 CC module/bdev/iscsi/bdev_iscsi.o 00:03:15.253 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:15.253 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:15.511 LIB libspdk_blobfs_bdev.a 00:03:15.511 LIB libspdk_bdev_split.a 00:03:15.511 LIB libspdk_bdev_gpt.a 00:03:15.511 LIB libspdk_bdev_null.a 00:03:15.511 LIB libspdk_bdev_error.a 00:03:15.511 LIB libspdk_bdev_ftl.a 00:03:15.511 LIB libspdk_bdev_aio.a 00:03:15.511 LIB libspdk_bdev_zone_block.a 00:03:15.511 LIB libspdk_bdev_passthru.a 00:03:15.511 LIB libspdk_bdev_delay.a 00:03:15.511 LIB libspdk_bdev_iscsi.a 00:03:15.512 LIB libspdk_bdev_malloc.a 00:03:15.769 LIB libspdk_bdev_lvol.a 00:03:15.769 LIB libspdk_bdev_virtio.a 00:03:15.769 LIB libspdk_bdev_raid.a 00:03:16.704 LIB libspdk_bdev_nvme.a 00:03:17.270 CC module/event/subsystems/iobuf/iobuf.o 00:03:17.270 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:17.270 CC module/event/subsystems/sock/sock.o 00:03:17.270 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:17.270 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:17.270 CC module/event/subsystems/vmd/vmd.o 00:03:17.270 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:17.270 CC module/event/subsystems/scheduler/scheduler.o 00:03:17.270 LIB libspdk_event_sock.a 00:03:17.270 LIB libspdk_event_vhost_blk.a 00:03:17.270 LIB libspdk_event_iobuf.a 00:03:17.270 LIB libspdk_event_vfu_tgt.a 00:03:17.270 LIB libspdk_event_vmd.a 00:03:17.270 LIB libspdk_event_scheduler.a 00:03:17.529 CC module/event/subsystems/accel/accel.o 00:03:17.787 LIB libspdk_event_accel.a 00:03:18.046 CC module/event/subsystems/bdev/bdev.o 00:03:18.046 LIB 
libspdk_event_bdev.a 00:03:18.305 CC module/event/subsystems/scsi/scsi.o 00:03:18.305 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:18.305 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:18.305 CC module/event/subsystems/nbd/nbd.o 00:03:18.305 CC module/event/subsystems/ublk/ublk.o 00:03:18.564 LIB libspdk_event_scsi.a 00:03:18.564 LIB libspdk_event_nbd.a 00:03:18.564 LIB libspdk_event_ublk.a 00:03:18.564 LIB libspdk_event_nvmf.a 00:03:18.822 CC module/event/subsystems/iscsi/iscsi.o 00:03:18.822 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:18.822 LIB libspdk_event_vhost_scsi.a 00:03:18.822 LIB libspdk_event_iscsi.a 00:03:19.080 CC app/spdk_top/spdk_top.o 00:03:19.080 CXX app/trace/trace.o 00:03:19.080 CC app/spdk_nvme_discover/discovery_aer.o 00:03:19.080 CC app/trace_record/trace_record.o 00:03:19.080 CC app/spdk_lspci/spdk_lspci.o 00:03:19.080 CC app/spdk_nvme_identify/identify.o 00:03:19.080 CC app/spdk_nvme_perf/perf.o 00:03:19.080 TEST_HEADER include/spdk/accel_module.h 00:03:19.080 TEST_HEADER include/spdk/accel.h 00:03:19.080 TEST_HEADER include/spdk/assert.h 00:03:19.080 CC test/rpc_client/rpc_client_test.o 00:03:19.080 TEST_HEADER include/spdk/barrier.h 00:03:19.080 TEST_HEADER include/spdk/base64.h 00:03:19.080 TEST_HEADER include/spdk/bdev.h 00:03:19.080 TEST_HEADER include/spdk/bdev_zone.h 00:03:19.080 TEST_HEADER include/spdk/bdev_module.h 00:03:19.349 TEST_HEADER include/spdk/bit_pool.h 00:03:19.349 TEST_HEADER include/spdk/blob_bdev.h 00:03:19.349 TEST_HEADER include/spdk/bit_array.h 00:03:19.349 TEST_HEADER include/spdk/blobfs.h 00:03:19.349 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:19.349 TEST_HEADER include/spdk/blob.h 00:03:19.349 TEST_HEADER include/spdk/conf.h 00:03:19.349 TEST_HEADER include/spdk/config.h 00:03:19.350 TEST_HEADER include/spdk/cpuset.h 00:03:19.350 TEST_HEADER include/spdk/crc16.h 00:03:19.350 TEST_HEADER include/spdk/crc32.h 00:03:19.350 TEST_HEADER include/spdk/crc64.h 00:03:19.350 TEST_HEADER include/spdk/dif.h 00:03:19.350 TEST_HEADER include/spdk/dma.h 00:03:19.350 TEST_HEADER include/spdk/endian.h 00:03:19.350 TEST_HEADER include/spdk/env_dpdk.h 00:03:19.350 TEST_HEADER include/spdk/env.h 00:03:19.350 TEST_HEADER include/spdk/event.h 00:03:19.350 TEST_HEADER include/spdk/fd_group.h 00:03:19.350 TEST_HEADER include/spdk/fd.h 00:03:19.350 TEST_HEADER include/spdk/file.h 00:03:19.350 TEST_HEADER include/spdk/gpt_spec.h 00:03:19.350 TEST_HEADER include/spdk/ftl.h 00:03:19.350 CC app/spdk_dd/spdk_dd.o 00:03:19.350 TEST_HEADER include/spdk/hexlify.h 00:03:19.350 TEST_HEADER include/spdk/histogram_data.h 00:03:19.350 TEST_HEADER include/spdk/idxd.h 00:03:19.350 TEST_HEADER include/spdk/idxd_spec.h 00:03:19.350 TEST_HEADER include/spdk/init.h 00:03:19.350 TEST_HEADER include/spdk/ioat.h 00:03:19.350 TEST_HEADER include/spdk/ioat_spec.h 00:03:19.350 CC app/nvmf_tgt/nvmf_main.o 00:03:19.350 TEST_HEADER include/spdk/json.h 00:03:19.350 TEST_HEADER include/spdk/iscsi_spec.h 00:03:19.350 TEST_HEADER include/spdk/jsonrpc.h 00:03:19.350 TEST_HEADER include/spdk/likely.h 00:03:19.350 TEST_HEADER include/spdk/log.h 00:03:19.350 TEST_HEADER include/spdk/lvol.h 00:03:19.350 TEST_HEADER include/spdk/memory.h 00:03:19.350 TEST_HEADER include/spdk/mmio.h 00:03:19.350 TEST_HEADER include/spdk/nbd.h 00:03:19.350 TEST_HEADER include/spdk/notify.h 00:03:19.350 CC app/iscsi_tgt/iscsi_tgt.o 00:03:19.350 TEST_HEADER include/spdk/nvme_intel.h 00:03:19.350 TEST_HEADER include/spdk/nvme.h 00:03:19.350 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:19.350 
TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:19.350 TEST_HEADER include/spdk/nvme_zns.h 00:03:19.350 TEST_HEADER include/spdk/nvme_spec.h 00:03:19.350 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:19.350 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:19.350 TEST_HEADER include/spdk/nvmf.h 00:03:19.350 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:19.350 TEST_HEADER include/spdk/nvmf_spec.h 00:03:19.350 TEST_HEADER include/spdk/nvmf_transport.h 00:03:19.350 TEST_HEADER include/spdk/opal.h 00:03:19.350 TEST_HEADER include/spdk/opal_spec.h 00:03:19.350 TEST_HEADER include/spdk/pci_ids.h 00:03:19.350 CC app/spdk_tgt/spdk_tgt.o 00:03:19.350 TEST_HEADER include/spdk/pipe.h 00:03:19.350 TEST_HEADER include/spdk/queue.h 00:03:19.350 TEST_HEADER include/spdk/rpc.h 00:03:19.350 TEST_HEADER include/spdk/reduce.h 00:03:19.350 CC app/vhost/vhost.o 00:03:19.350 TEST_HEADER include/spdk/scheduler.h 00:03:19.350 TEST_HEADER include/spdk/scsi.h 00:03:19.350 TEST_HEADER include/spdk/sock.h 00:03:19.350 TEST_HEADER include/spdk/scsi_spec.h 00:03:19.350 TEST_HEADER include/spdk/stdinc.h 00:03:19.350 TEST_HEADER include/spdk/string.h 00:03:19.350 TEST_HEADER include/spdk/trace.h 00:03:19.350 TEST_HEADER include/spdk/trace_parser.h 00:03:19.350 TEST_HEADER include/spdk/thread.h 00:03:19.350 TEST_HEADER include/spdk/ublk.h 00:03:19.350 TEST_HEADER include/spdk/tree.h 00:03:19.350 TEST_HEADER include/spdk/util.h 00:03:19.350 TEST_HEADER include/spdk/uuid.h 00:03:19.350 TEST_HEADER include/spdk/version.h 00:03:19.350 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:19.350 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:19.350 TEST_HEADER include/spdk/vhost.h 00:03:19.350 TEST_HEADER include/spdk/vmd.h 00:03:19.350 TEST_HEADER include/spdk/zipf.h 00:03:19.350 TEST_HEADER include/spdk/xor.h 00:03:19.350 CXX test/cpp_headers/accel.o 00:03:19.350 CXX test/cpp_headers/accel_module.o 00:03:19.350 CXX test/cpp_headers/assert.o 00:03:19.350 CXX test/cpp_headers/barrier.o 00:03:19.350 CXX test/cpp_headers/base64.o 00:03:19.350 CXX test/cpp_headers/bdev.o 00:03:19.350 CXX test/cpp_headers/bdev_module.o 00:03:19.350 CXX test/cpp_headers/bdev_zone.o 00:03:19.350 CXX test/cpp_headers/bit_pool.o 00:03:19.350 CXX test/cpp_headers/bit_array.o 00:03:19.350 CXX test/cpp_headers/blob_bdev.o 00:03:19.350 CXX test/cpp_headers/blobfs_bdev.o 00:03:19.350 CXX test/cpp_headers/blobfs.o 00:03:19.350 CXX test/cpp_headers/blob.o 00:03:19.350 CXX test/cpp_headers/conf.o 00:03:19.350 CXX test/cpp_headers/config.o 00:03:19.350 CXX test/cpp_headers/cpuset.o 00:03:19.350 CXX test/cpp_headers/crc16.o 00:03:19.350 CXX test/cpp_headers/crc32.o 00:03:19.350 CXX test/cpp_headers/crc64.o 00:03:19.350 CXX test/cpp_headers/dma.o 00:03:19.350 CXX test/cpp_headers/dif.o 00:03:19.350 CXX test/cpp_headers/env_dpdk.o 00:03:19.350 CXX test/cpp_headers/endian.o 00:03:19.350 CXX test/cpp_headers/env.o 00:03:19.350 CXX test/cpp_headers/event.o 00:03:19.350 CXX test/cpp_headers/fd_group.o 00:03:19.350 CXX test/cpp_headers/fd.o 00:03:19.350 CXX test/cpp_headers/file.o 00:03:19.350 CXX test/cpp_headers/gpt_spec.o 00:03:19.350 CXX test/cpp_headers/ftl.o 00:03:19.350 CXX test/cpp_headers/hexlify.o 00:03:19.350 CC examples/ioat/verify/verify.o 00:03:19.350 CXX test/cpp_headers/histogram_data.o 00:03:19.350 CXX test/cpp_headers/idxd.o 00:03:19.350 CXX test/cpp_headers/idxd_spec.o 00:03:19.350 CXX test/cpp_headers/init.o 00:03:19.350 CC examples/nvme/hello_world/hello_world.o 00:03:19.350 CC examples/ioat/perf/perf.o 00:03:19.350 CC examples/nvme/reconnect/reconnect.o 
00:03:19.350 CC examples/nvme/arbitration/arbitration.o 00:03:19.350 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:19.350 CC test/app/histogram_perf/histogram_perf.o 00:03:19.350 CC examples/vmd/lsvmd/lsvmd.o 00:03:19.350 CC examples/util/zipf/zipf.o 00:03:19.350 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:19.350 CC examples/nvme/hotplug/hotplug.o 00:03:19.350 CC examples/idxd/perf/perf.o 00:03:19.350 CC examples/sock/hello_world/hello_sock.o 00:03:19.350 CC examples/accel/perf/accel_perf.o 00:03:19.350 CC examples/nvme/abort/abort.o 00:03:19.350 CC examples/vmd/led/led.o 00:03:19.350 CC test/app/jsoncat/jsoncat.o 00:03:19.350 CC test/app/stub/stub.o 00:03:19.350 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:19.350 CC test/env/memory/memory_ut.o 00:03:19.350 CC test/thread/poller_perf/poller_perf.o 00:03:19.350 CC test/env/pci/pci_ut.o 00:03:19.350 CC test/nvme/e2edp/nvme_dp.o 00:03:19.350 CC test/env/vtophys/vtophys.o 00:03:19.350 CC test/thread/lock/spdk_lock.o 00:03:19.350 CC test/nvme/aer/aer.o 00:03:19.350 CC test/nvme/reserve/reserve.o 00:03:19.350 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:19.350 CC app/fio/nvme/fio_plugin.o 00:03:19.350 CC test/nvme/connect_stress/connect_stress.o 00:03:19.350 CC test/nvme/overhead/overhead.o 00:03:19.350 CC test/nvme/reset/reset.o 00:03:19.350 CC test/nvme/sgl/sgl.o 00:03:19.350 CC test/nvme/err_injection/err_injection.o 00:03:19.350 LINK spdk_lspci 00:03:19.350 CC test/nvme/startup/startup.o 00:03:19.350 CC test/event/reactor/reactor.o 00:03:19.350 CC test/event/reactor_perf/reactor_perf.o 00:03:19.350 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:19.350 CC test/nvme/fdp/fdp.o 00:03:19.350 CC test/nvme/compliance/nvme_compliance.o 00:03:19.350 CC test/event/event_perf/event_perf.o 00:03:19.350 CC test/nvme/cuse/cuse.o 00:03:19.350 CXX test/cpp_headers/ioat.o 00:03:19.350 CC test/nvme/fused_ordering/fused_ordering.o 00:03:19.350 CC test/nvme/simple_copy/simple_copy.o 00:03:19.350 CC examples/blob/hello_world/hello_blob.o 00:03:19.350 CC examples/bdev/hello_world/hello_bdev.o 00:03:19.350 CC examples/blob/cli/blobcli.o 00:03:19.350 CC test/app/bdev_svc/bdev_svc.o 00:03:19.350 CC test/nvme/boot_partition/boot_partition.o 00:03:19.350 CC test/event/app_repeat/app_repeat.o 00:03:19.350 CC examples/bdev/bdevperf/bdevperf.o 00:03:19.350 CC examples/nvmf/nvmf/nvmf.o 00:03:19.350 CC test/accel/dif/dif.o 00:03:19.350 CC test/dma/test_dma/test_dma.o 00:03:19.350 CC examples/thread/thread/thread_ex.o 00:03:19.350 CC test/blobfs/mkfs/mkfs.o 00:03:19.350 CC test/event/scheduler/scheduler.o 00:03:19.350 CC app/fio/bdev/fio_plugin.o 00:03:19.350 CC test/bdev/bdevio/bdevio.o 00:03:19.350 LINK rpc_client_test 00:03:19.350 LINK spdk_nvme_discover 00:03:19.350 CC test/env/mem_callbacks/mem_callbacks.o 00:03:19.612 CC test/lvol/esnap/esnap.o 00:03:19.612 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:19.612 LINK nvmf_tgt 00:03:19.612 LINK lsvmd 00:03:19.612 LINK interrupt_tgt 00:03:19.612 LINK spdk_trace_record 00:03:19.612 CXX test/cpp_headers/ioat_spec.o 00:03:19.612 LINK vhost 00:03:19.612 CXX test/cpp_headers/iscsi_spec.o 00:03:19.612 CXX test/cpp_headers/json.o 00:03:19.612 CXX test/cpp_headers/jsonrpc.o 00:03:19.612 LINK led 00:03:19.612 LINK histogram_perf 00:03:19.612 LINK jsoncat 00:03:19.612 CXX test/cpp_headers/likely.o 00:03:19.612 CXX test/cpp_headers/log.o 00:03:19.612 CXX test/cpp_headers/lvol.o 00:03:19.612 CXX test/cpp_headers/memory.o 00:03:19.612 LINK zipf 00:03:19.612 CXX test/cpp_headers/mmio.o 00:03:19.612 LINK 
iscsi_tgt 00:03:19.612 CXX test/cpp_headers/nbd.o 00:03:19.612 CXX test/cpp_headers/notify.o 00:03:19.612 CXX test/cpp_headers/nvme.o 00:03:19.612 CXX test/cpp_headers/nvme_intel.o 00:03:19.612 CXX test/cpp_headers/nvme_ocssd.o 00:03:19.612 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:19.612 CXX test/cpp_headers/nvme_spec.o 00:03:19.612 CXX test/cpp_headers/nvme_zns.o 00:03:19.612 CXX test/cpp_headers/nvmf_cmd.o 00:03:19.612 LINK poller_perf 00:03:19.612 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:19.612 CXX test/cpp_headers/nvmf.o 00:03:19.612 CXX test/cpp_headers/nvmf_spec.o 00:03:19.612 CXX test/cpp_headers/nvmf_transport.o 00:03:19.612 CXX test/cpp_headers/opal.o 00:03:19.612 LINK reactor_perf 00:03:19.612 LINK vtophys 00:03:19.612 CXX test/cpp_headers/opal_spec.o 00:03:19.612 LINK reactor 00:03:19.612 CXX test/cpp_headers/pci_ids.o 00:03:19.612 LINK spdk_tgt 00:03:19.612 CXX test/cpp_headers/pipe.o 00:03:19.612 CXX test/cpp_headers/queue.o 00:03:19.612 LINK pmr_persistence 00:03:19.612 LINK event_perf 00:03:19.612 CXX test/cpp_headers/reduce.o 00:03:19.612 CXX test/cpp_headers/rpc.o 00:03:19.612 LINK env_dpdk_post_init 00:03:19.612 CXX test/cpp_headers/scheduler.o 00:03:19.612 LINK connect_stress 00:03:19.612 LINK app_repeat 00:03:19.612 LINK startup 00:03:19.612 LINK ioat_perf 00:03:19.612 LINK reserve 00:03:19.612 CXX test/cpp_headers/scsi.o 00:03:19.612 LINK stub 00:03:19.612 LINK cmb_copy 00:03:19.612 LINK boot_partition 00:03:19.612 LINK doorbell_aers 00:03:19.612 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:19.612 struct spdk_nvme_fdp_ruhs ruhs; 00:03:19.612 ^ 00:03:19.612 LINK err_injection 00:03:19.612 LINK hello_sock 00:03:19.612 CXX test/cpp_headers/scsi_spec.o 00:03:19.612 LINK verify 00:03:19.612 CXX test/cpp_headers/sock.o 00:03:19.612 LINK hello_world 00:03:19.612 LINK bdev_svc 00:03:19.612 LINK hotplug 00:03:19.612 LINK mkfs 00:03:19.612 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:19.612 CXX test/cpp_headers/stdinc.o 00:03:19.612 LINK fused_ordering 00:03:19.612 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:19.612 LINK spdk_trace 00:03:19.612 LINK hello_blob 00:03:19.612 LINK fdp 00:03:19.612 LINK reset 00:03:19.612 LINK simple_copy 00:03:19.612 LINK thread 00:03:19.873 LINK overhead 00:03:19.873 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:19.873 LINK scheduler 00:03:19.873 LINK hello_bdev 00:03:19.873 LINK mem_callbacks 00:03:19.873 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:19.873 CXX test/cpp_headers/string.o 00:03:19.873 LINK nvme_dp 00:03:19.873 CXX test/cpp_headers/thread.o 00:03:19.873 LINK reconnect 00:03:19.873 LINK aer 00:03:19.873 LINK sgl 00:03:19.873 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:19.873 CXX test/cpp_headers/trace.o 00:03:19.873 CXX test/cpp_headers/trace_parser.o 00:03:19.873 CXX test/cpp_headers/tree.o 00:03:19.873 CXX test/cpp_headers/ublk.o 00:03:19.873 CXX test/cpp_headers/util.o 00:03:19.873 CXX test/cpp_headers/uuid.o 00:03:19.873 CXX test/cpp_headers/version.o 00:03:19.873 LINK nvmf 00:03:19.873 CXX test/cpp_headers/vfio_user_pci.o 00:03:19.873 CXX test/cpp_headers/vfio_user_spec.o 00:03:19.873 CXX test/cpp_headers/vhost.o 00:03:19.873 CXX test/cpp_headers/vmd.o 00:03:19.873 CXX test/cpp_headers/xor.o 00:03:19.873 CXX test/cpp_headers/zipf.o 00:03:19.873 LINK spdk_dd 00:03:19.873 LINK test_dma 00:03:19.873 LINK bdevio 00:03:19.873 LINK idxd_perf 
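The TEST_HEADER include/spdk/*.h lines paired with the CXX test/cpp_headers/*.o compiles above are the suite proving that every public SPDK header builds as its own translation unit, i.e. is self-contained. A minimal stand-alone sketch of that idea, in the spirit of the trace (the loop, the temporary file names, and the c++ invocation are illustrative assumptions, not the harness's actual code):

for hdr in include/spdk/*.h; do
  name=$(basename "$hdr" .h)
  tu="/tmp/hdr_${name}.cpp"                   # one translation unit per header
  printf '#include <spdk/%s.h>\n' "$name" > "$tu"
  # the lone #include fails to compile if the header is not self-contained
  c++ -I include -c "$tu" -o "${tu%.cpp}.o" || echo "not self-contained: $hdr"
done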
00:03:19.873 LINK arbitration 00:03:19.873 LINK dif 00:03:19.873 LINK abort 00:03:20.132 LINK blobcli 00:03:20.132 LINK nvme_manage 00:03:20.132 LINK accel_perf 00:03:20.132 LINK nvme_compliance 00:03:20.132 1 warning generated. 00:03:20.132 LINK pci_ut 00:03:20.132 LINK memory_ut 00:03:20.132 LINK spdk_nvme_identify 00:03:20.132 LINK llvm_vfio_fuzz 00:03:20.132 LINK spdk_nvme 00:03:20.132 LINK nvme_fuzz 00:03:20.132 LINK vhost_fuzz 00:03:20.391 LINK spdk_bdev 00:03:20.391 LINK spdk_top 00:03:20.391 LINK bdevperf 00:03:20.391 LINK spdk_nvme_perf 00:03:20.650 LINK cuse 00:03:20.650 LINK llvm_nvme_fuzz 00:03:21.219 LINK spdk_lock 00:03:21.219 LINK iscsi_fuzz 00:03:23.123 LINK esnap 00:03:23.692 00:03:23.692 real 0m23.992s 00:03:23.692 user 4m35.158s 00:03:23.692 sys 1m50.864s 00:03:23.692 02:51:18 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:23.692 02:51:18 -- common/autotest_common.sh@10 -- $ set +x 00:03:23.692 ************************************ 00:03:23.692 END TEST make 00:03:23.692 ************************************ 00:03:23.692 02:51:18 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:23.692 02:51:18 -- nvmf/common.sh@7 -- # uname -s 00:03:23.692 02:51:18 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:23.692 02:51:18 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:23.692 02:51:18 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:23.692 02:51:18 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:23.692 02:51:18 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:23.692 02:51:18 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:23.692 02:51:18 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:23.692 02:51:18 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:23.692 02:51:18 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:23.692 02:51:18 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:23.692 02:51:18 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:23.692 02:51:18 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:23.692 02:51:18 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:23.692 02:51:18 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:23.692 02:51:18 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:23.692 02:51:18 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:23.692 02:51:18 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:23.692 02:51:18 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:23.692 02:51:18 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:23.692 02:51:18 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:23.692 02:51:18 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:23.692 02:51:18 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:23.692 02:51:18 -- paths/export.sh@5 -- # export PATH 00:03:23.692 02:51:18 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:23.692 02:51:18 -- nvmf/common.sh@46 -- # : 0 00:03:23.692 02:51:18 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:23.692 02:51:18 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:23.692 02:51:18 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:23.692 02:51:18 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:23.692 02:51:18 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:23.692 02:51:18 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:23.692 02:51:18 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:23.692 02:51:18 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:23.692 02:51:18 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:23.692 02:51:18 -- spdk/autotest.sh@32 -- # uname -s 00:03:23.692 02:51:18 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:23.692 02:51:18 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:23.692 02:51:18 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:23.692 02:51:18 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:23.692 02:51:18 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:23.692 02:51:18 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:23.692 02:51:18 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:23.692 02:51:18 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:23.692 02:51:18 -- spdk/autotest.sh@48 -- # udevadm_pid=597940 00:03:23.692 02:51:18 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:23.692 02:51:18 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:23.692 02:51:18 -- spdk/autotest.sh@54 -- # echo 597942 00:03:23.692 02:51:18 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:23.692 02:51:18 -- spdk/autotest.sh@56 -- # echo 597943 00:03:23.692 02:51:18 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:23.692 02:51:18 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:23.692 02:51:18 -- spdk/autotest.sh@60 -- # echo 597944 00:03:23.692 02:51:18 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:23.692 02:51:18 -- spdk/autotest.sh@62 -- # echo 597945 00:03:23.692 02:51:18 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:23.692 02:51:18 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:23.692 02:51:18 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:23.692 02:51:18 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:23.692 02:51:18 -- common/autotest_common.sh@10 -- # set +x 00:03:23.692 02:51:18 -- spdk/autotest.sh@70 -- # create_test_list 00:03:23.692 02:51:18 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:23.692 02:51:18 -- common/autotest_common.sh@10 -- # set +x 00:03:23.692 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:23.692 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:23.692 02:51:18 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:23.692 02:51:18 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:23.692 02:51:18 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:23.692 02:51:18 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:23.692 02:51:18 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:23.692 02:51:18 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:23.692 02:51:18 -- common/autotest_common.sh@1440 -- # uname 00:03:23.692 02:51:18 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:23.692 02:51:18 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:23.692 02:51:18 -- common/autotest_common.sh@1460 -- # uname 00:03:23.692 02:51:18 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:23.692 02:51:18 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:23.692 02:51:18 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:23.692 02:51:18 -- spdk/autotest.sh@83 -- # hash lcov 00:03:23.692 02:51:18 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:23.692 02:51:18 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:23.692 02:51:18 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:23.692 02:51:18 -- common/autotest_common.sh@10 -- # set +x 00:03:23.692 02:51:18 -- spdk/autotest.sh@102 -- # rm -f 00:03:23.951 02:51:18 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:27.237 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:27.237 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:27.237 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:27.237 02:51:22 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:27.237 02:51:22 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:27.237 02:51:22 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:27.237 02:51:22 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:27.237 02:51:22 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:27.237 02:51:22 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:27.237 02:51:22 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:27.237 02:51:22 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:27.237 02:51:22 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:27.237 02:51:22 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:27.237 02:51:22 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:27.237 02:51:22 -- spdk/autotest.sh@121 -- # grep -v p 00:03:27.237 02:51:22 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:27.237 02:51:22 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:27.237 02:51:22 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:27.237 02:51:22 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:27.237 02:51:22 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:27.237 No valid GPT data, bailing 00:03:27.237 02:51:22 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:27.237 02:51:22 -- scripts/common.sh@393 -- # pt= 00:03:27.237 02:51:22 -- scripts/common.sh@394 -- # return 1 00:03:27.237 02:51:22 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:27.496 1+0 records in 00:03:27.496 1+0 records out 00:03:27.496 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00170322 s, 616 MB/s 00:03:27.496 02:51:22 -- spdk/autotest.sh@129 -- # sync 00:03:27.496 02:51:22 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:27.496 02:51:22 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:27.496 02:51:22 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:35.617 02:51:29 -- spdk/autotest.sh@135 -- # uname -s 00:03:35.617 02:51:29 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:35.617 02:51:29 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:35.617 02:51:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:35.617 02:51:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:35.617 02:51:29 -- common/autotest_common.sh@10 -- # set +x 00:03:35.617 ************************************ 00:03:35.617 START TEST setup.sh 00:03:35.617 ************************************ 00:03:35.617 02:51:29 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:35.617 * Looking for test storage... 00:03:35.617 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:35.617 02:51:29 -- setup/test-setup.sh@10 -- # uname -s 00:03:35.617 02:51:29 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:35.617 02:51:29 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:35.617 02:51:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:35.617 02:51:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:35.617 02:51:29 -- common/autotest_common.sh@10 -- # set +x 00:03:35.617 ************************************ 00:03:35.617 START TEST acl 00:03:35.617 ************************************ 00:03:35.617 02:51:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:35.617 * Looking for test storage... 00:03:35.617 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:35.617 02:51:29 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:35.617 02:51:29 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:35.617 02:51:29 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:35.617 02:51:29 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:35.617 02:51:29 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:35.617 02:51:29 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:35.617 02:51:29 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:35.617 02:51:29 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:35.617 02:51:29 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:35.617 02:51:29 -- setup/acl.sh@12 -- # devs=() 00:03:35.617 02:51:29 -- setup/acl.sh@12 -- # declare -a devs 00:03:35.617 02:51:29 -- setup/acl.sh@13 -- # drivers=() 00:03:35.617 02:51:29 -- setup/acl.sh@13 -- # declare -A drivers 00:03:35.617 02:51:29 -- setup/acl.sh@51 -- # setup reset 00:03:35.617 02:51:29 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:35.617 02:51:29 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:38.908 02:51:33 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:38.908 02:51:33 -- setup/acl.sh@16 -- # local dev driver 00:03:38.908 02:51:33 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:38.908 02:51:33 -- setup/acl.sh@15 -- # setup output status 00:03:38.908 02:51:33 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:38.908 02:51:33 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:41.446 Hugepages 00:03:41.446 node hugesize free / total 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # continue 00:03:41.446 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # continue 00:03:41.446 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # continue 00:03:41.446 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.446 00:03:41.446 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # continue 00:03:41.446 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.446 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.447 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.447 02:51:36 -- setup/acl.sh@20 -- # continue 00:03:41.447 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.707 02:51:36 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:41.707 02:51:36 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:41.707 02:51:36 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:41.707 02:51:36 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:41.707 02:51:36 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:41.707 02:51:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.707 02:51:36 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:41.707 02:51:36 -- setup/acl.sh@54 -- # run_test denied denied 00:03:41.707 02:51:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:41.707 02:51:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:41.707 02:51:36 -- common/autotest_common.sh@10 -- # set +x 00:03:41.707 ************************************ 00:03:41.707 START TEST denied 00:03:41.707 ************************************ 00:03:41.707 02:51:36 -- common/autotest_common.sh@1104 -- # denied 00:03:41.707 02:51:36 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:41.707 02:51:36 -- setup/acl.sh@38 -- # setup output config 00:03:41.707 02:51:36 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:41.707 02:51:36 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.707 02:51:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:45.000 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:45.000 02:51:40 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:45.000 02:51:40 -- setup/acl.sh@28 -- # local dev driver 00:03:45.000 02:51:40 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:45.000 02:51:40 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:45.000 02:51:40 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:45.000 02:51:40 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:45.000 02:51:40 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:45.000 02:51:40 -- setup/acl.sh@41 -- # setup reset 00:03:45.000 02:51:40 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:45.000 02:51:40 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:50.370 00:03:50.370 real 0m7.921s 00:03:50.370 user 0m2.596s 00:03:50.370 sys 0m4.703s 00:03:50.370 02:51:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:50.370 02:51:44 -- 
common/autotest_common.sh@10 -- # set +x 00:03:50.370 ************************************ 00:03:50.370 END TEST denied 00:03:50.370 ************************************ 00:03:50.370 02:51:44 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:50.370 02:51:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:50.370 02:51:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:50.370 02:51:44 -- common/autotest_common.sh@10 -- # set +x 00:03:50.370 ************************************ 00:03:50.370 START TEST allowed 00:03:50.370 ************************************ 00:03:50.370 02:51:44 -- common/autotest_common.sh@1104 -- # allowed 00:03:50.370 02:51:44 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:50.370 02:51:44 -- setup/acl.sh@45 -- # setup output config 00:03:50.370 02:51:44 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:50.370 02:51:44 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.370 02:51:44 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:54.562 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:54.562 02:51:49 -- setup/acl.sh@47 -- # verify 00:03:54.562 02:51:49 -- setup/acl.sh@28 -- # local dev driver 00:03:54.562 02:51:49 -- setup/acl.sh@48 -- # setup reset 00:03:54.562 02:51:49 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:54.562 02:51:49 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:58.758 00:03:58.758 real 0m8.706s 00:03:58.758 user 0m2.544s 00:03:58.758 sys 0m4.768s 00:03:58.758 02:51:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.758 02:51:53 -- common/autotest_common.sh@10 -- # set +x 00:03:58.758 ************************************ 00:03:58.758 END TEST allowed 00:03:58.758 ************************************ 00:03:58.758 00:03:58.758 real 0m23.897s 00:03:58.758 user 0m7.731s 00:03:58.758 sys 0m14.451s 00:03:58.758 02:51:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.758 02:51:53 -- common/autotest_common.sh@10 -- # set +x 00:03:58.758 ************************************ 00:03:58.758 END TEST acl 00:03:58.758 ************************************ 00:03:58.758 02:51:53 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:58.758 02:51:53 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:58.758 02:51:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:58.758 02:51:53 -- common/autotest_common.sh@10 -- # set +x 00:03:58.758 ************************************ 00:03:58.758 START TEST hugepages 00:03:58.758 ************************************ 00:03:58.758 02:51:53 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:58.758 * Looking for test storage... 
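The get_zoned_devs / is_block_zoned xtrace in the acl run above reduces to a small sysfs scan. A hedged reconstruction of that logic (the sysfs paths and the "!= none" test come straight from the trace; packaging it as a stand-alone loop with a final echo is an illustrative addition):

declare -A zoned_devs=()
for nvme in /sys/block/nvme*; do
  dev=$(basename "$nvme")
  # per the trace: a device counts as zoned only when queue/zoned exists
  # and holds something other than "none"
  if [[ -e /sys/block/$dev/queue/zoned ]] && [[ $(< /sys/block/$dev/queue/zoned) != none ]]; then
    zoned_devs[$dev]=1
  fi
done
echo "zoned devices: ${!zoned_devs[*]} (count ${#zoned_devs[@]})"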
00:03:58.758 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:58.758 02:51:53 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:58.758 02:51:53 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:58.758 02:51:53 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:58.758 02:51:53 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:58.758 02:51:53 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:58.758 02:51:53 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:58.758 02:51:53 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:58.758 02:51:53 -- setup/common.sh@18 -- # local node= 00:03:58.758 02:51:53 -- setup/common.sh@19 -- # local var val 00:03:58.758 02:51:53 -- setup/common.sh@20 -- # local mem_f mem 00:03:58.758 02:51:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:58.758 02:51:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:58.758 02:51:53 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:58.758 02:51:53 -- setup/common.sh@28 -- # mapfile -t mem 00:03:58.758 02:51:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.758 02:51:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40487656 kB' 'MemAvailable: 42870872 kB' 'Buffers: 12536 kB' 'Cached: 11211572 kB' 'SwapCached: 16 kB' 'Active: 9445280 kB' 'Inactive: 2354388 kB' 'Active(anon): 8969928 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579204 kB' 'Mapped: 186684 kB' 'Shmem: 8451456 kB' 'KReclaimable: 248160 kB' 'Slab: 784008 kB' 'SReclaimable: 248160 kB' 'SUnreclaim: 535848 kB' 'KernelStack: 21984 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 10396636 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:58.758 02:51:53 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.758 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.758 02:51:53 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.758 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.758 02:51:53 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.758 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.758 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.759 02:51:53 
-- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:58.759 02:51:53 -- setup/common.sh@32 -- # continue
00:03:58.759 02:51:53 -- setup/common.sh@31 -- # IFS=': '
00:03:58.759 02:51:53 -- setup/common.sh@31 -- # read -r var val _
[… repetitive xtrace elided: each remaining /proc/meminfo field (Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd, HugePages_Surp) fails the \H\u\g\e\p\a\g\e\s\i\z\e match and hits continue …]
00:03:58.760 02:51:53 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:58.760 02:51:53 -- setup/common.sh@33 -- # echo 2048
00:03:58.760 02:51:53 -- setup/common.sh@33 -- # return 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:58.760 02:51:53 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:58.760 02:51:53 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:03:58.760 02:51:53 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:58.760 02:51:53 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
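The xtrace above is setup/common.sh's get_meminfo walking /proc/meminfo key by key until the requested field (Hugepagesize) matches, then echoing its value: 2048 kB pages on this node. A minimal sketch of that parsing pattern, assuming a direct read of /proc/meminfo; the function name here is illustrative, and the real helper mapfiles the file (with optional per-NUMA-node sources) before scanning:

    #!/usr/bin/env bash
    # Hypothetical stand-in for setup/common.sh's get_meminfo; illustrative only.
    get_meminfo_value() {
        local get=$1 var val _
        # Split each "Key:   value kB" line on ':' and whitespace
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # not the key we want: keep scanning
            echo "$val"                        # e.g. 2048 for Hugepagesize
            return 0
        done < /proc/meminfo
        return 1                               # key not present
    }

    get_meminfo_value Hugepagesize             # -> 2048 on this build node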
00:03:58.760 02:51:53 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:03:58.760 02:51:53 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:03:58.760 02:51:53 -- setup/hugepages.sh@207 -- # get_nodes
00:03:58.760 02:51:53 -- setup/hugepages.sh@27 -- # local node
00:03:58.760 02:51:53 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:58.760 02:51:53 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:03:58.760 02:51:53 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:58.760 02:51:53 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:58.760 02:51:53 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:58.760 02:51:53 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:58.760 02:51:53 -- setup/hugepages.sh@208 -- # clear_hp
00:03:58.760 02:51:53 -- setup/hugepages.sh@37 -- # local node hp
00:03:58.760 02:51:53 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:03:58.760 02:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:58.760 02:51:53 -- setup/hugepages.sh@41 -- # echo 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:58.760 02:51:53 -- setup/hugepages.sh@41 -- # echo 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:03:58.760 02:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:58.760 02:51:53 -- setup/hugepages.sh@41 -- # echo 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:58.760 02:51:53 -- setup/hugepages.sh@41 -- # echo 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:03:58.760 02:51:53 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:03:58.760 02:51:53 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:03:58.760 02:51:53 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:58.760 02:51:53 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:58.760 02:51:53 -- common/autotest_common.sh@10 -- # set +x
00:03:58.760 ************************************
00:03:58.760 START TEST default_setup
00:03:58.760 ************************************
00:03:58.760 02:51:53 -- common/autotest_common.sh@1104 -- # default_setup
00:03:58.760 02:51:53 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:58.760 02:51:53 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:58.760 02:51:53 -- setup/hugepages.sh@51 -- # shift
00:03:58.760 02:51:53 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:58.760 02:51:53 -- setup/hugepages.sh@52 -- # local node_ids
00:03:58.760 02:51:53 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:58.760 02:51:53 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:58.760 02:51:53 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:58.760 02:51:53 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:58.760 02:51:53 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:58.760 02:51:53 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:58.760 02:51:53 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:58.760 02:51:53 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:58.760 02:51:53 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
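get_nodes and clear_hp above enumerate both NUMA nodes and zero every per-node hugepage pool, and get_test_nr_hugepages then converts the requested 2097152 kB into 2097152 / 2048 = 1024 pages pinned to node 0 (the nodes_test assignment traced just below). A hedged sketch of that sysfs mechanism; the paths are the standard kernel interface, but the loop is a reconstruction, not a copy of setup/hugepages.sh:

    #!/usr/bin/env bash
    # Zero every hugepage pool on every NUMA node, then size node 0.
    # Assumes the standard sysfs layout; run as root on a NUMA machine.
    size_kb=2097152                        # 2 GiB requested by default_setup
    page_kb=2048                           # Hugepagesize from /proc/meminfo
    nr_hugepages=$(( size_kb / page_kb ))  # 2097152 / 2048 = 1024

    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"    # clear_hp: drop existing pools
        done
    done

    # Pin all 1024 2 MiB pages to node 0, as the test's nodes_test[0]=1024 does
    echo "$nr_hugepages" > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages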
00:03:58.760 02:51:53 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:58.760 02:51:53 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:58.760 02:51:53 -- setup/hugepages.sh@73 -- # return 0
00:03:58.760 02:51:53 -- setup/hugepages.sh@137 -- # setup output
00:03:58.760 02:51:53 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:58.760 02:51:53 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:02.053 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:02.053 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:03.435 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:03.435 02:51:58 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:04:03.435 02:51:58 -- setup/hugepages.sh@89 -- # local node
00:04:03.435 02:51:58 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:03.435 02:51:58 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:03.435 02:51:58 -- setup/hugepages.sh@92 -- # local surp
00:04:03.435 02:51:58 -- setup/hugepages.sh@93 -- # local resv
00:04:03.435 02:51:58 -- setup/hugepages.sh@94 -- # local anon
00:04:03.435 02:51:58 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:03.435 02:51:58 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:03.435 02:51:58 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:03.435 02:51:58 -- setup/common.sh@18 -- # local node=
00:04:03.435 02:51:58 -- setup/common.sh@19 -- # local var val
00:04:03.435 02:51:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.435 02:51:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.435 02:51:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.435 02:51:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.435 02:51:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.435 02:51:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.435 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.435 02:51:58 -- setup/common.sh@31 -- # read -r var val _
00:04:03.435 02:51:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42755992 kB' 'MemAvailable: 45139232 kB' 'Buffers: 12536 kB' 'Cached: 11211692 kB' 'SwapCached: 16 kB' 'Active: 9460032 kB' 'Inactive: 2354388 kB' 'Active(anon): 8984680 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592872 kB' 'Mapped: 186964 kB' 'Shmem: 8451576 kB' 'KReclaimable: 248208 kB' 'Slab: 781964 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533756 kB' 'KernelStack: 22096 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10413120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
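In the setup.sh output above, the sixteen ioatdma channels and the Intel NVMe controller (8086 0a54) are detached from their kernel drivers and handed to vfio-pci so SPDK can drive them from user space. A sketch of the generic sysfs rebind mechanism, hedged: these are standard kernel interfaces, not necessarily the exact steps setup.sh performs internally:

    #!/usr/bin/env bash
    # Generic sysfs rebind for one device; setup.sh's real logic additionally
    # handles IOMMU groups, device allowlists, and hugepage mounts.
    bdf=0000:d8:00.0                      # the NVMe controller from the log
    dev=/sys/bus/pci/devices/$bdf

    # Detach the current driver (nvme here), if one is bound
    [[ -e $dev/driver ]] && echo "$bdf" > "$dev/driver/unbind"

    # Let the PCI core match this device to vfio-pci, then re-probe it
    modprobe vfio-pci
    echo vfio-pci > "$dev/driver_override"
    echo "$bdf" > /sys/bus/pci/drivers_probe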
00:04:03.435 02:51:58 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:03.435 02:51:58 -- setup/common.sh@32 -- # continue
00:04:03.435 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.435 02:51:58 -- setup/common.sh@31 -- # read -r var val _
[… repetitive xtrace elided: every field from MemFree through HardwareCorrupted fails the \A\n\o\n\H\u\g\e\P\a\g\e\s match and hits continue …]
00:04:03.436 02:51:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:03.436 02:51:58 -- setup/common.sh@33 -- # echo 0
00:04:03.436 02:51:58 -- setup/common.sh@33 -- # return 0
00:04:03.436 02:51:58 -- setup/hugepages.sh@97 -- # anon=0
00:04:03.436 02:51:58 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:03.436 02:51:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.436 02:51:58 -- setup/common.sh@18 -- # local node=
00:04:03.436 02:51:58 -- setup/common.sh@19 -- # local var val
00:04:03.436 02:51:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.437 02:51:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.437 02:51:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.437 02:51:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.437 02:51:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.437 02:51:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.437 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.437 02:51:58 -- setup/common.sh@31 -- # read -r var val _
00:04:03.437 02:51:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42756900 kB' 'MemAvailable: 45140140 kB' 'Buffers: 12536 kB' 'Cached: 11211696 kB' 'SwapCached: 16 kB' 'Active: 9458624 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983272 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592300 kB' 'Mapped: 186864 kB' 'Shmem: 8451580 kB' 'KReclaimable: 248208 kB' 'Slab: 781956 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533748 kB' 'KernelStack: 22128 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10414524 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213396 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
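verify_nr_hugepages first checks the transparent hugepage state (the hugepages.sh@96 test against "always [madvise] never") and only then samples AnonHugePages, which is 0 kB in the dump above. A small sketch of that guard, assuming the standard sysfs location for the THP mode:

    #!/usr/bin/env bash
    # Sketch of the hugepages.sh@96 guard traced above: the bracketed token in
    # this sysfs file is the active THP mode ("always [madvise] never" here).
    thp=/sys/kernel/mm/transparent_hugepage/enabled
    if [[ $(<"$thp") != *"[never]"* ]]; then
        # THP can back anonymous mappings, so record current usage;
        # AnonHugePages is 0 kB in this run.
        awk '/^AnonHugePages:/ {print "AnonHugePages:", $2, "kB"}' /proc/meminfo
    fi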
00:04:03.437 02:51:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.437 02:51:58 -- setup/common.sh@32 -- # continue
00:04:03.437 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.437 02:51:58 -- setup/common.sh@31 -- # read -r var val _
[… repetitive xtrace elided: every field from MemFree through HugePages_Rsvd fails the \H\u\g\e\P\a\g\e\s\_\S\u\r\p match and hits continue …]
00:04:03.438 02:51:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.438 02:51:58 -- setup/common.sh@33 -- # echo 0
00:04:03.438 02:51:58 -- setup/common.sh@33 -- # return 0
00:04:03.438 02:51:58 -- setup/hugepages.sh@99 -- # surp=0
00:04:03.438 02:51:58 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:03.438 02:51:58 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:03.438 02:51:58 -- setup/common.sh@18 -- # local node=
00:04:03.438 02:51:58 -- setup/common.sh@19 -- # local var val
00:04:03.438 02:51:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.438 02:51:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.438 02:51:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.438 02:51:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.438 02:51:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.438 02:51:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.438 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.438 02:51:58 -- setup/common.sh@31 -- # read -r var val _
00:04:03.439 02:51:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42755172 kB' 'MemAvailable: 45138412 kB' 'Buffers: 12536 kB' 'Cached: 11211696 kB' 'SwapCached: 16 kB' 'Active: 9458828 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983476 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592452 kB' 'Mapped: 186864 kB' 'Shmem: 8451580 kB' 'KReclaimable: 248208 kB' 'Slab: 781956 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533748 kB' 'KernelStack: 22096 kB' 'PageTables: 8520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10414540 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:03.439 02:51:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:03.439 02:51:58 -- setup/common.sh@32 -- # continue
[… repetitive xtrace elided: every field from MemFree through HugePages_Free fails the \H\u\g\e\P\a\g\e\s\_\R\s\v\d match and hits continue …]
00:04:03.702 02:51:58 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:03.702 02:51:58 -- setup/common.sh@33 -- # echo 0
00:04:03.702 02:51:58 -- setup/common.sh@33 -- # return 0
00:04:03.702 02:51:58 -- setup/hugepages.sh@100 -- # resv=0
00:04:03.702 02:51:58 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:03.702 nr_hugepages=1024
00:04:03.702 02:51:58 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:03.702 resv_hugepages=0
00:04:03.702 02:51:58 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:03.702 surplus_hugepages=0
00:04:03.702 02:51:58 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:03.702 anon_hugepages=0
00:04:03.702 02:51:58 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:03.702 02:51:58 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:03.702 02:51:58 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:03.702 02:51:58 -- setup/common.sh@17 -- # local get=HugePages_Total
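The test's acceptance criteria are traced right above: HugePages_Total must equal the 1024 pages requested, with surplus and reserved pages fully accounted for. A self-contained re-derivation of those checks using awk over /proc/meminfo (illustrative; the real script reads the same counters through get_meminfo):

    #!/usr/bin/env bash
    # Re-derive the verify_nr_hugepages assertions from this run.
    expected=1024

    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1024 here
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)     # 0 here
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)     # 0 here

    echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp"

    # The pool must account for exactly the requested pages ...
    (( expected == total + surp + resv )) || { echo FAIL; exit 1; }
    # ... and none of them may be surplus- or reservation-backed.
    (( expected == total )) || { echo FAIL; exit 1; }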
00:04:03.702 02:51:58 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:03.702 02:51:58 -- setup/common.sh@18 -- # local node=
00:04:03.702 02:51:58 -- setup/common.sh@19 -- # local var val
00:04:03.702 02:51:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.702 02:51:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.702 02:51:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.702 02:51:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.702 02:51:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.702 02:51:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.702 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.702 02:51:58 -- setup/common.sh@31 -- # read -r var val _
00:04:03.703 02:51:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42754728 kB' 'MemAvailable: 45137968 kB' 'Buffers: 12536 kB' 'Cached: 11211716 kB' 'SwapCached: 16 kB' 'Active: 9458420 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983068 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591912 kB' 'Mapped: 186864 kB' 'Shmem: 8451600 kB' 'KReclaimable: 248208 kB' 'Slab: 781956 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533748 kB' 'KernelStack: 22032 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10414392 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
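[Editor's note: the mem=("${mem[@]#Node +([0-9]) }") record above is what lets the same parser read both /proc/meminfo and the per-node files, whose lines carry a "Node N " prefix. A small self-contained demonstration of that expansion, with sample lines modeled on the node0 snapshot further below; +([0-9]) is an extglob pattern, so extglob must be enabled.]

    #!/usr/bin/env bash
    # "${mem[@]#pattern}" strips the shortest matching prefix from every
    # array element; +([0-9]) matches one or more digits (extglob).
    shopt -s extglob
    mem=('Node 0 MemTotal: 32592084 kB' 'Node 0 HugePages_Surp: 0')
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}"
    # MemTotal: 32592084 kB
    # HugePages_Surp: 0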
[... xtrace elided: the read loop tests each /proc/meminfo key against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and hits continue until the key matches ...]
00:04:03.704 02:51:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:03.704 02:51:58 -- setup/common.sh@33 -- # echo 1024
00:04:03.704 02:51:58 -- setup/common.sh@33 -- # return 0
00:04:03.704 02:51:58 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:03.704 02:51:58 -- setup/hugepages.sh@112 -- # get_nodes
00:04:03.704 02:51:58 -- setup/hugepages.sh@27 -- # local node
00:04:03.704 02:51:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.704 02:51:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:03.704 02:51:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.704 02:51:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:03.704 02:51:58 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:03.704 02:51:58 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:03.704 02:51:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.704 02:51:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.704 02:51:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:03.704 02:51:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.704 02:51:58 -- setup/common.sh@18 -- # local node=0
00:04:03.704 02:51:58 -- setup/common.sh@19 -- # local var val
00:04:03.704 02:51:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.704 02:51:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.704 02:51:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:03.704 02:51:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:03.704 02:51:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.704 02:51:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.704 02:51:58 -- setup/common.sh@31 -- # IFS=': '
00:04:03.704 02:51:58 -- setup/common.sh@31 -- # read -r var val _
00:04:03.704 02:51:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25365020 kB' 'MemUsed: 7227064 kB' 'SwapCached: 16 kB' 'Active: 3438976 kB' 'Inactive: 180704 kB' 'Active(anon): 3222356 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376376 kB' 'Mapped: 120524 kB' 'AnonPages: 246564 kB' 'Shmem: 2979052 kB' 'KernelStack: 11720 kB' 'PageTables: 4704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 372760 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 244736 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
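[Editor's note: the two get_meminfo invocations above differ only in the node argument. With node unset the helper keeps mem_f=/proc/meminfo (the [[ -e .../node/meminfo ]] test fails); with node=0 it switches to the node's sysfs meminfo, as the @24 record shows. A sketch of that source-file selection, using the real kernel paths from the records:]

    #!/usr/bin/env bash
    # Pick the meminfo source for an optional NUMA node argument.
    node=${1-}                     # empty means "whole system"
    mem_f=/proc/meminfo
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    echo "reading $mem_f"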
[... xtrace elided: each node0 meminfo key is tested against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and hits continue until the key matches ...]
00:04:03.705 02:51:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.705 02:51:58 -- setup/common.sh@33 -- # echo 0
00:04:03.705 02:51:58 -- setup/common.sh@33 -- # return 0
00:04:03.705 02:51:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:03.705 02:51:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.705 02:51:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.705 02:51:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.705 02:51:58 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:04:03.705 02:51:58 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:03.705 
00:04:03.705 real	0m5.033s
00:04:03.705 user	0m1.391s
00:04:03.706 sys	0m2.241s
00:04:03.706 02:51:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:03.706 02:51:58 -- common/autotest_common.sh@10 -- # set +x
00:04:03.706 ************************************
00:04:03.706 END TEST default_setup
00:04:03.706 ************************************
00:04:03.706 02:51:58 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:03.706 02:51:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:03.706 02:51:58 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:03.706 02:51:58 -- common/autotest_common.sh@10 -- # set +x
00:04:03.706 ************************************
00:04:03.706 START TEST per_node_1G_alloc
00:04:03.706 ************************************
00:04:03.706 02:51:58 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:04:03.706 02:51:58 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:03.706 02:51:58 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:03.706 02:51:58 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:03.706 02:51:58 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:03.706 02:51:58 -- setup/hugepages.sh@51 -- # shift
00:04:03.706 02:51:58 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:03.706 02:51:58 -- setup/hugepages.sh@52 -- # local node_ids
00:04:03.706 02:51:58 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:03.706 02:51:58 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:03.706 02:51:58 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:03.706 02:51:58 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:03.706 02:51:58 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:03.706 02:51:58 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:03.706 02:51:58 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:03.706 02:51:58 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:03.706 02:51:58 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:03.706 02:51:58 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:03.706 02:51:58 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:03.706 02:51:58 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:03.706 02:51:58 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:03.706 02:51:58 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:03.706 02:51:58 -- setup/hugepages.sh@73 -- # return 0
00:04:03.706 02:51:58 -- setup/hugepages.sh@146 -- # NRHUGE=512
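[Editor's note: get_test_nr_hugepages 1048576 0 1 above turns a size into a page count. The nr_hugepages=512 it assigns to each of nodes 0 and 1 matches 1048576 kB (1 GiB) per node divided by the 2048 kB Hugepagesize reported in the meminfo snapshots. A sketch of that arithmetic; variable names here are illustrative, the numbers come from the records:]

    #!/usr/bin/env bash
    # 1 GiB per node, expressed in kB, over the default 2048 kB huge page.
    size_kb=1048576
    hugepage_kb=2048
    per_node=$(( size_kb / hugepage_kb ))   # 512
    total=$(( per_node * 2 ))               # 1024 across nodes 0 and 1
    echo "per_node=$per_node total=$total"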
00:04:03.706 02:51:58 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:03.706 02:51:58 -- setup/hugepages.sh@146 -- # setup output
00:04:03.706 02:51:58 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:03.706 02:51:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:06.995 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:06.995 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:07.257 02:52:02 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:07.257 02:52:02 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:07.257 02:52:02 -- setup/hugepages.sh@89 -- # local node
00:04:07.257 02:52:02 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:07.257 02:52:02 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:07.257 02:52:02 -- setup/hugepages.sh@92 -- # local surp
00:04:07.257 02:52:02 -- setup/hugepages.sh@93 -- # local resv
00:04:07.257 02:52:02 -- setup/hugepages.sh@94 -- # local anon
00:04:07.257 02:52:02 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:07.257 02:52:02 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:07.257 02:52:02 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:07.257 02:52:02 -- setup/common.sh@18 -- # local node=
00:04:07.257 02:52:02 -- setup/common.sh@19 -- # local var val
00:04:07.257 02:52:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.257 02:52:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.257 02:52:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.257 02:52:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.257 02:52:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.258 02:52:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.258 02:52:02 -- setup/common.sh@31 -- # IFS=': '
00:04:07.258 02:52:02 -- setup/common.sh@31 -- # read -r var val _
00:04:07.258 02:52:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42813344 kB' 'MemAvailable: 45196584 kB' 'Buffers: 12536 kB' 'Cached: 11211816 kB' 'SwapCached: 16 kB' 'Active: 9454676 kB' 'Inactive: 2354388 kB' 'Active(anon): 8979324 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587916 kB' 'Mapped: 185728 kB' 'Shmem: 8451700 kB' 'KReclaimable: 248208 kB' 'Slab: 782580 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534372 kB' 'KernelStack: 21888 kB' 'PageTables: 8072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10399608 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213508 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
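[Editor's note: the "[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]" record above is verify_nr_hugepages checking the transparent hugepage mode; the kernel marks the active mode with brackets in the sysfs file. A sketch of the same guard against the real path:]

    #!/usr/bin/env bash
    # The selected THP mode is the bracketed token, e.g. "always [madvise] never".
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        echo "THP is not disabled: $thp"
    fi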
[... xtrace elided: every key from MemTotal through HardwareCorrupted is tested against \A\n\o\n\H\u\g\e\P\a\g\e\s and hits continue ...]
00:04:07.259 02:52:02 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:07.259 02:52:02 -- setup/common.sh@33 -- # echo 0
00:04:07.259 02:52:02 -- setup/common.sh@33 -- # return 0
00:04:07.259 02:52:02 -- setup/hugepages.sh@97 -- # anon=0
00:04:07.259 02:52:02 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:07.259 02:52:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.259 02:52:02 -- setup/common.sh@18 -- # local node=
00:04:07.259 02:52:02 -- setup/common.sh@19 -- # local var val
00:04:07.259 02:52:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.259 02:52:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.259 02:52:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.259 02:52:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.259 02:52:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.259 02:52:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.259 02:52:02 -- setup/common.sh@31 -- # IFS=': '
00:04:07.259 02:52:02 -- setup/common.sh@31 -- # read -r var val _
00:04:07.259 02:52:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42814392 kB' 'MemAvailable: 45197632 kB' 'Buffers: 12536 kB' 'Cached: 11211824 kB' 'SwapCached: 16 kB' 'Active: 9454908 kB' 'Inactive: 2354388 kB' 'Active(anon): 8979556 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588172 kB' 'Mapped: 185676 kB' 'Shmem: 8451708 kB' 'KReclaimable: 248208 kB' 'Slab: 782604 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534396 kB' 'KernelStack: 21888 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10399756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
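[Editor's note: verify_nr_hugepages is collecting anon, surp and resv from successive get_meminfo calls so it can re-check the pool the same way the earlier (( 1024 == nr_hugepages + surp + resv )) record did. A standalone sketch of that invariant; awk stands in for the script's own helper, and the expected count of 1024 is taken from this run:]

    #!/usr/bin/env bash
    # HugePages_Total should equal the requested count plus surplus plus reserved.
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk  '/^HugePages_Surp:/  {print $2}' /proc/meminfo)
    resv=$(awk  '/^HugePages_Rsvd:/  {print $2}' /proc/meminfo)
    expected=1024
    (( total == expected + surp + resv )) && echo "hugepage pool verified: $total pages"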
[xtrace elided: setup/common.sh@31-32, get_meminfo reads each /proc/meminfo line in turn (Unevictable, Mlocked, SwapTotal, ... down through HugePages_Free) and continues past every key that is not HugePages_Surp]
00:04:07.260 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.260 02:52:02 -- setup/common.sh@33 -- # echo 0
00:04:07.260 02:52:02 -- setup/common.sh@33 -- # return 0
00:04:07.260 02:52:02 -- setup/hugepages.sh@99 -- # surp=0
00:04:07.260 02:52:02 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:07.260 02:52:02 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:07.260 02:52:02 -- setup/common.sh@18 -- # local node=
00:04:07.260 02:52:02 -- setup/common.sh@19 -- # local var val
00:04:07.260 02:52:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.260 02:52:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.260 02:52:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.260 02:52:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.260 02:52:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.260 02:52:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.260 02:52:02 -- setup/common.sh@31 -- # IFS=': '
00:04:07.260 02:52:02 -- setup/common.sh@31 -- # read -r var val _
00:04:07.260 02:52:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42814392 kB' 'MemAvailable: 45197632 kB' 'Buffers: 12536 kB' 'Cached: 11211824 kB' 'SwapCached: 16 kB' 'Active: 9455132 kB' 'Inactive: 2354388 kB' 'Active(anon): 8979780 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588456 kB' 'Mapped: 185680 kB' 'Shmem: 8451708 kB' 'KReclaimable: 248208 kB' 'Slab: 782604 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534396 kB' 'KernelStack: 21904 kB' 'PageTables: 8136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10400136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
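The lookup being traced here boils down to the following minimal sketch (plain bash; get_meminfo_sketch is an illustrative name, not the SPDK setup/common.sh source verbatim): read the chosen meminfo file into an array, strip the "Node N " prefix that sysfs copies carry, then scan key by key until the requested one is found and print its value.

  shopt -s extglob
  get_meminfo_sketch() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo var val _ line
    # Per-node counters live in sysfs; those lines carry a "Node N " prefix.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix, if any
    for line in "${mem[@]}"; do
      IFS=': ' read -r var val _ <<< "$line"
      [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
  }
  # e.g. get_meminfo_sketch HugePages_Surp    -> 0 on the box traced above
  #      get_meminfo_sketch HugePages_Surp 0  -> node0's surplus count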
[xtrace elided: the scan then walks every key from MemTotal down, continuing past each one that is not HugePages_Rsvd]
00:04:07.261 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:07.261 02:52:02 -- setup/common.sh@33 -- # echo 0
00:04:07.261 02:52:02 -- setup/common.sh@33 -- # return 0
00:04:07.261 02:52:02 -- setup/hugepages.sh@100 -- # resv=0
00:04:07.261 02:52:02 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:07.261 nr_hugepages=1024
00:04:07.261 02:52:02 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:07.261 resv_hugepages=0
00:04:07.261 02:52:02 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:07.261 surplus_hugepages=0
00:04:07.261 02:52:02 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:07.261 anon_hugepages=0
00:04:07.261 02:52:02 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:07.261 02:52:02 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
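Taken together, the surp/resv lookups and the arithmetic tests above implement a pool-accounting check. A sketch of the assumed semantics, reusing get_meminfo_sketch from the earlier sketch (verify_hugepage_pool is an invented helper name): the test passes only when the kernel's hugepage counters add up to the pool it configured.

  verify_hugepage_pool() {
    local want=$1 surp resv total
    surp=$(get_meminfo_sketch HugePages_Surp)   # pages allocated beyond the configured pool
    resv=$(get_meminfo_sketch HugePages_Rsvd)   # pages promised to mappings, not yet faulted in
    total=$(get_meminfo_sketch HugePages_Total)
    (( total == want ))               || return 1  # pool is exactly the requested size
    (( total == want + surp + resv )) || return 1  # no stray surplus/reserved pages
    echo "nr_hugepages=$want resv_hugepages=$resv surplus_hugepages=$surp"
  }
  # verify_hugepage_pool 1024  -> nr_hugepages=1024 resv_hugepages=0 surplus_hugepages=0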
00:04:07.261 02:52:02 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:07.261 02:52:02 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:07.261 02:52:02 -- setup/common.sh@18 -- # local node=
00:04:07.261 02:52:02 -- setup/common.sh@19 -- # local var val
00:04:07.261 02:52:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.261 02:52:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.261 02:52:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.261 02:52:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.261 02:52:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.261 02:52:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.262 02:52:02 -- setup/common.sh@16 -- # printf '%s\n' [system meminfo snapshot identical to the one above, except 'Committed_AS: 10400152 kB']
[xtrace elided: the scan walks MemTotal through Unaccepted, none matching HugePages_Total]
00:04:07.263 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:07.263 02:52:02 -- setup/common.sh@33 -- # echo 1024
00:04:07.263 02:52:02 -- setup/common.sh@33 -- # return 0
00:04:07.263 02:52:02 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:07.263 02:52:02 -- setup/hugepages.sh@112 -- # get_nodes
00:04:07.263 02:52:02 -- setup/hugepages.sh@27 -- # local node
00:04:07.263 02:52:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.263 02:52:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:07.263 02:52:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.263 02:52:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:07.263 02:52:02 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:07.263 02:52:02 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:07.263 02:52:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.263 02:52:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.263 02:52:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:07.263 02:52:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.263 02:52:02 -- setup/common.sh@18 -- # local node=0
00:04:07.263 02:52:02 -- setup/common.sh@19 -- # local var val
00:04:07.263 02:52:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.263 02:52:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.263 02:52:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:07.263 02:52:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:07.263 02:52:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.263 02:52:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.263 02:52:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26422864 kB' 'MemUsed: 6169220 kB' 'SwapCached: 16 kB' 'Active: 3438988 kB' 'Inactive: 180704 kB' 'Active(anon): 3222368 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376464 kB' 'Mapped: 120244 kB' 'AnonPages: 246460 kB' 'Shmem: 2979140 kB' 'KernelStack: 11720 kB' 'PageTables: 4696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 373644 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 245620 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
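The get_nodes trace a few entries above enumerates the machine's NUMA nodes with an extglob pattern. A minimal sketch of that enumeration (the literal 512 mirrors the per-node counts this box reports; how the real script derives that number is outside this excerpt):

  shopt -s extglob nullglob
  declare -A nodes_sys
  for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=512   # key: node index parsed off the path tail
  done
  no_nodes=${#nodes_sys[@]}         # 2 on this machine
  (( no_nodes > 0 ))                # bail out if the sysfs layout is unexpected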
[xtrace elided: the node0 scan walks MemTotal through HugePages_Free, none matching HugePages_Surp]
00:04:07.264 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.264 02:52:02 -- setup/common.sh@33 -- # echo 0
00:04:07.264 02:52:02 -- setup/common.sh@33 -- # return 0
00:04:07.264 02:52:02 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:07.264 02:52:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.264 02:52:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.264 02:52:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:07.264 02:52:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.264 02:52:02 -- setup/common.sh@18 -- # local node=1
00:04:07.264 02:52:02 -- setup/common.sh@19 -- # local var val
00:04:07.264 02:52:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.264 02:52:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.264 02:52:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:07.264 02:52:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:07.264 02:52:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.264 02:52:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.264 02:52:02 -- setup/common.sh@31 -- # IFS=': '
00:04:07.264 02:52:02 -- setup/common.sh@31 -- # read -r var val _
00:04:07.264 02:52:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16392084 kB' 'MemUsed: 11311064 kB' 'SwapCached: 0 kB' 'Active: 6016088 kB' 'Inactive: 2173684 kB' 'Active(anon): 5757356 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7847952 kB' 'Mapped: 65436 kB' 'AnonPages: 341896 kB' 'Shmem: 5472608 kB' 'KernelStack: 10136 kB' 'PageTables: 3284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120184 kB' 'Slab: 408960 kB' 'SReclaimable: 120184 kB' 'SUnreclaim: 288776 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: the node1 scan walks MemTotal through HugePages_Free, none matching HugePages_Surp]
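The mem=("${mem[@]#Node +([0-9]) }") step right before each per-node dump is what lets the same read loop parse sysfs files: their lines carry a "Node N " prefix, and without stripping it the IFS=': ' tokenizer lands on the wrong fields. A short demonstration with illustrative values:

  shopt -s extglob
  line='Node 1 HugePages_Surp:        0'
  IFS=': ' read -r var val _ <<< "$line"
  echo "$var=$val"                 # -> Node=1 (wrong fields)
  line=${line#Node +([0-9]) }      # strip the "Node 1 " prefix
  IFS=': ' read -r var val _ <<< "$line"
  echo "$var=$val"                 # -> HugePages_Surp=0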
setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # continue 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.265 02:52:02 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.265 02:52:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.265 02:52:02 -- setup/common.sh@33 -- # echo 0 00:04:07.265 02:52:02 -- setup/common.sh@33 -- # return 0 00:04:07.265 02:52:02 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:07.265 02:52:02 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:07.265 02:52:02 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:07.265 02:52:02 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:07.265 02:52:02 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:07.265 node0=512 expecting 512 00:04:07.265 02:52:02 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:07.265 02:52:02 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:07.265 02:52:02 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:07.265 02:52:02 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:07.265 node1=512 expecting 512 00:04:07.265 02:52:02 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:07.265 00:04:07.265 real 0m3.581s 00:04:07.265 user 0m1.336s 00:04:07.265 sys 0m2.312s 00:04:07.265 02:52:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:07.265 02:52:02 -- common/autotest_common.sh@10 -- # set +x 00:04:07.265 ************************************ 00:04:07.265 END TEST per_node_1G_alloc 00:04:07.265 
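The hugepages.sh@126-@130 lines above are the tail of per_node_1G_alloc's verification: each node's observed hugepage count is folded into the key set sorted_t (and the system-side counts into sorted_s), so a single surviving key of 512 shows both nodes got an even share. A minimal bash sketch of that pattern, with the arrays pre-filled to match this run; the real values come from per-node meminfo, and the final comparison is inferred from the `[[ 512 == \5\1\2 ]]` trace:

    nodes_test=([0]=512 [1]=512)   # per-node counts the test observed (illustrative)
    nodes_sys=([0]=0 [1]=0)        # pages attributed to the system outside the test
    sorted_t=() sorted_s=()

    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1   # distinct test counts become array keys
        sorted_s[nodes_sys[node]]=1    # distinct system counts become array keys
        echo "node$node=${nodes_test[node]} expecting 512"
    done

    # Exactly one key, equal to 512, means every node received the same 512 pages.
    [[ ${!sorted_t[*]} == 512 ]] && echo "per-node allocation is even"

Using array keys as a set collapses duplicates, so the final check stays a single comparison no matter how many nodes report.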
00:04:07.265 02:52:02 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:07.265 02:52:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:07.265 02:52:02 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:07.265 02:52:02 -- common/autotest_common.sh@10 -- # set +x
00:04:07.265 ************************************
00:04:07.265 START TEST even_2G_alloc
00:04:07.265 ************************************
00:04:07.265 02:52:02 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:04:07.265 02:52:02 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:07.265 02:52:02 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:07.265 02:52:02 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:07.265 02:52:02 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:07.265 02:52:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:07.265 02:52:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:07.265 02:52:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:07.265 02:52:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:07.265 02:52:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:07.265 02:52:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:07.265 02:52:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:07.265 02:52:02 -- setup/hugepages.sh@83 -- # : 512
00:04:07.265 02:52:02 -- setup/hugepages.sh@84 -- # : 1
00:04:07.265 02:52:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:07.265 02:52:02 -- setup/hugepages.sh@83 -- # : 0
00:04:07.265 02:52:02 -- setup/hugepages.sh@84 -- # : 0
00:04:07.265 02:52:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:07.265 02:52:02 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:07.265 02:52:02 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:07.265 02:52:02 -- setup/hugepages.sh@153 -- # setup output
00:04:07.265 02:52:02 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:07.265 02:52:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
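The get_test_nr_hugepages trace above reduces the request to pages and splits it across both NUMA nodes before setup.sh runs. A sketch of that arithmetic under the values in this run; the request appears to be in kB (2097152 kB at the 2048 kB Hugepagesize shown in the snapshots below yields the traced nr_hugepages=1024), and the remainder handling is inferred from the `: 512 / : 1 / : 0 / : 0` no-op traces:

    size=2097152                 # requested size in kB (2 GiB)
    default_hugepages=2048       # Hugepagesize in kB, per the snapshots below
    nr_hugepages=$((size / default_hugepages))    # -> 1024 pages
    _no_nodes=2                  # NUMA nodes on this host
    _nr_hugepages=$nr_hugepages
    nodes_test=()
    while ((_no_nodes > 0)); do
        # even share for each remaining node, filled from the highest index down
        nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes))
        : $((_nr_hugepages -= nodes_test[_no_nodes - 1]))   # traces ": 512" then ": 0"
        : $((--_no_nodes))                                  # traces ": 1" then ": 0"
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"    # node0=512 node1=512

Filling from the highest node index down means any remainder from the integer division lands on node 0, though with 1024 pages over 2 nodes the split is exact.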
00:04:11.458 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:11.458 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:11.458 02:52:05 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:11.458 02:52:05 -- setup/hugepages.sh@89 -- # local node
00:04:11.458 02:52:05 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:11.458 02:52:05 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:11.458 02:52:05 -- setup/hugepages.sh@92 -- # local surp
00:04:11.458 02:52:05 -- setup/hugepages.sh@93 -- # local resv
00:04:11.458 02:52:05 -- setup/hugepages.sh@94 -- # local anon
00:04:11.458 02:52:05 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:11.458 02:52:05 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:11.458 02:52:06 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:11.458 02:52:06 -- setup/common.sh@18 -- # local node=
00:04:11.458 02:52:06 -- setup/common.sh@19 -- # local var val
00:04:11.458 02:52:06 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.458 02:52:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.458 02:52:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:11.458 02:52:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:11.458 02:52:06 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.458 02:52:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.458 02:52:06 -- setup/common.sh@31 -- # IFS=': '
00:04:11.458 02:52:06 -- setup/common.sh@31 -- # read -r var val _
00:04:11.458 02:52:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42903448 kB' 'MemAvailable: 45286688 kB' 'Buffers: 12536 kB' 'Cached: 11211956 kB' 'SwapCached: 16 kB' 'Active: 9455580 kB' 'Inactive: 2354388 kB' 'Active(anon): 8980228 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588820 kB' 'Mapped: 186096 kB' 'Shmem: 8451840 kB' 'KReclaimable: 248208 kB' 'Slab: 782824 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534616 kB' 'KernelStack: 21904 kB' 'PageTables: 8152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10401656 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213444 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
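Each get_meminfo call snapshots the whole of /proc/meminfo (the printf above) and then walks it one field at a time, which is what produces the long runs of `[[ <field> == ... ]] / continue` elided below. A sketch of the parser as reconstructed from the setup/common.sh@17-@33 trace lines; the real script's internals may differ in detail:

    #!/usr/bin/env bash
    shopt -s extglob   # the +([0-9]) pattern below needs extended globbing

    # get_meminfo as reconstructed from the setup/common.sh@17-@33 trace above.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo mem

        # A node argument switches to that node's meminfo file when it exists.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines start with "Node N "

        # Field-by-field scan: this loop is the source of the long
        # "[[ <field> == ... ]] / continue" runs in the trace.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        echo 0   # field absent: report 0 rather than fail
    }

    get_meminfo HugePages_Surp    # prints 0 on this box, matching the trace
    get_meminfo HugePages_Free 0  # same lookup against NUMA node 0's file

The same function serves the per-node queries seen earlier in the log: passing a node number switches mem_f to /sys/devices/system/node/nodeN/meminfo and strips the "Node N" prefix those files carry, which is why the trace tests `[[ -e /sys/devices/system/node/node/meminfo ]]` (empty node) here.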
00:04:11.458 02:52:06 -- setup/common.sh@32 -- # continue   [scan elided: MemTotal through HardwareCorrupted each tested against \A\n\o\n\H\u\g\e\P\a\g\e\s and skipped]
00:04:11.459 02:52:06 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:11.459 02:52:06 -- setup/common.sh@33 -- # echo 0
00:04:11.459 02:52:06 -- setup/common.sh@33 -- # return 0
00:04:11.459 02:52:06 -- setup/hugepages.sh@97 -- # anon=0
00:04:11.459 02:52:06 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:11.459 02:52:06 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:11.459 02:52:06 -- setup/common.sh@18 -- # local node=
00:04:11.459 02:52:06 -- setup/common.sh@19 -- # local var val
00:04:11.459 02:52:06 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.459 02:52:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.459 02:52:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:11.459 02:52:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:11.459 02:52:06 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.459 02:52:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.459 02:52:06 -- setup/common.sh@31 -- # IFS=': '
00:04:11.459 02:52:06 -- setup/common.sh@31 -- # read -r var val _
00:04:11.459 02:52:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42908140 kB' 'MemAvailable: 45291380 kB' 'Buffers: 12536 kB' 'Cached: 11211956 kB' 'SwapCached: 16 kB' 'Active: 9455788 kB' 'Inactive: 2354388 kB' 'Active(anon): 8980436 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589044 kB' 'Mapped: 185684 kB' 'Shmem: 8451840 kB' 'KReclaimable: 248208 kB' 'Slab: 782872 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534664 kB' 'KernelStack: 21904 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10400780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213396 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:11.460 02:52:06 -- setup/common.sh@32 -- # continue   [scan elided: MemTotal through HugePages_Free tested against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skipped]
00:04:11.461 02:52:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:11.461 02:52:06 -- setup/common.sh@33 -- # echo 0
00:04:11.461 02:52:06 -- setup/common.sh@33 -- # return 0
00:04:11.461 02:52:06 -- setup/hugepages.sh@99 -- # surp=0
00:04:11.461 02:52:06 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:11.461 02:52:06 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:11.461 02:52:06 -- setup/common.sh@18 -- # local node=
00:04:11.461 02:52:06 -- setup/common.sh@19 -- # local var val
00:04:11.461 02:52:06 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.461 02:52:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.461 02:52:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:11.461 02:52:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:11.461 02:52:06 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.461 02:52:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.461 02:52:06 -- setup/common.sh@31 -- # IFS=': '
00:04:11.461 02:52:06 -- setup/common.sh@31 -- # read -r var val _
00:04:11.461 02:52:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42906520 kB' 'MemAvailable: 45289760 kB' 'Buffers: 12536 kB' 'Cached: 11211968 kB' 'SwapCached: 16 kB' 'Active: 9456348 kB' 'Inactive: 2354388 kB' 'Active(anon): 8980996 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589600 kB' 'Mapped: 185684 kB' 'Shmem: 8451852 kB' 'KReclaimable: 248208 kB' 'Slab: 782872 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534664 kB' 'KernelStack: 21936 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10405192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:11.461 02:52:06 -- setup/common.sh@32 -- # continue   [scan elided: every field tested against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and skipped until the match below]
00:04:11.462 02:52:06 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:11.462 02:52:06 -- setup/common.sh@33 -- # echo 0
00:04:11.462 02:52:06 -- setup/common.sh@33 -- # return 0
00:04:11.462 02:52:06 -- setup/hugepages.sh@100 -- # resv=0
00:04:11.462 02:52:06 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:11.462 nr_hugepages=1024
00:04:11.462 02:52:06 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:11.462 resv_hugepages=0
00:04:11.462 02:52:06 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:11.462 surplus_hugepages=0
00:04:11.462 02:52:06 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:11.462 anon_hugepages=0
00:04:11.462 02:52:06 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:11.462 02:52:06 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
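With anon, surp, and resv collected (all 0 here), the @107/@109 assertions above pin down that the 1024 pages requested are all present and that none of them are surplus or reserved; the get_meminfo HugePages_Total call that follows re-reads the total for the per-node tally. A self-contained sketch of that bookkeeping; meminfo_val is a stand-in helper, not part of the SPDK scripts, and the exact variable sourcing in hugepages.sh is inferred from the trace:

    # Stand-in for get_meminfo: pull one field's numeric value from /proc/meminfo.
    meminfo_val() { awk -v k="$1:" '$1 == k {print $2}' /proc/meminfo; }

    requested=1024                          # NRHUGE for this test
    anon=$(meminfo_val AnonHugePages)       # 0 kB in the snapshots above
    surp=$(meminfo_val HugePages_Surp)      # 0
    resv=$(meminfo_val HugePages_Rsvd)      # 0
    total=$(meminfo_val HugePages_Total)    # 1024

    echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"

    # The two assertions traced at hugepages.sh@107 and @109:
    ((requested == total + surp + resv)) || { echo "hugepage total mismatch" >&2; exit 1; }
    ((requested == total)) || { echo "unexpected surplus/reserved pages" >&2; exit 1; }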
00:04:11.462 02:52:06 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:11.462 02:52:06 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:11.462 02:52:06 -- setup/common.sh@18 -- # local node=
00:04:11.462 02:52:06 -- setup/common.sh@19 -- # local var val
00:04:11.462 02:52:06 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.462 02:52:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.462 02:52:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:11.462 02:52:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:11.462 02:52:06 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.462 02:52:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.462 02:52:06 -- setup/common.sh@31 -- # IFS=': '
00:04:11.462 02:52:06 -- setup/common.sh@31 -- # read -r var val _
00:04:11.462 02:52:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42908400 kB' 'MemAvailable: 45291640 kB' 'Buffers: 12536 kB' 'Cached: 11211988 kB' 'SwapCached: 16 kB' 'Active: 9456524 kB' 'Inactive: 2354388 kB' 'Active(anon): 8981172 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589744 kB' 'Mapped: 185684 kB' 'Shmem: 8451872 kB' 'KReclaimable: 248208 kB' 'Slab: 782872 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534664 kB' 'KernelStack: 21984 kB' 'PageTables: 8008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue   [field-by-field scan against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l in progress; repetitive per-field tests elided]
02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # continue 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.464 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 
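The loop trimmed above is just a key lookup: setup/common.sh splits each meminfo line on ': ' and skips every key that is not the one requested, then echoes the value of the matching key. A minimal stand-alone sketch of that pattern (the function name and the direct file read are illustrative, not the SPDK helper itself):

#!/usr/bin/env bash
# Sketch: look up one key in a meminfo-style file, skipping all other keys
# with 'continue' exactly as the xtrace above shows.
get_meminfo_value() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # non-matching keys are skipped
        echo "$val"                        # e.g. 1024 for HugePages_Total
        return 0
    done <"$mem_f"
    return 1
}
get_meminfo_value HugePages_Total   # prints the current hugepage pool size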
00:04:11.464 02:52:06 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.464 02:52:06 -- setup/common.sh@33 -- # echo 1024 00:04:11.464 02:52:06 -- setup/common.sh@33 -- # return 0 00:04:11.464 02:52:06 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:11.464 02:52:06 -- setup/hugepages.sh@112 -- # get_nodes 00:04:11.464 02:52:06 -- setup/hugepages.sh@27 -- # local node 00:04:11.465 02:52:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:11.465 02:52:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:11.465 02:52:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:11.465 02:52:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:11.465 02:52:06 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:11.465 02:52:06 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:11.465 02:52:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:11.465 02:52:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:11.465 02:52:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:11.465 02:52:06 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:11.465 02:52:06 -- setup/common.sh@18 -- # local node=0 00:04:11.465 02:52:06 -- setup/common.sh@19 -- # local var val 00:04:11.465 02:52:06 -- setup/common.sh@20 -- # local mem_f mem 00:04:11.465 02:52:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.465 02:52:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:11.465 02:52:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:11.465 02:52:06 -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.465 02:52:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.465 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.465 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.465 02:52:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26486660 kB' 'MemUsed: 6105424 kB' 'SwapCached: 16 kB' 'Active: 3440384 kB' 'Inactive: 180704 kB' 'Active(anon): 3223764 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376540 kB' 'Mapped: 120252 kB' 'AnonPages: 247892 kB' 'Shmem: 2979216 kB' 'KernelStack: 11896 kB' 'PageTables: 5160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 373736 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 245712 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace trimmed: setup/common.sh@31-@32 test each node0 meminfo key from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skip it with 'continue']
00:04:11.466 02:52:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.466 02:52:06 -- setup/common.sh@33 -- # echo 0 00:04:11.466 02:52:06 -- setup/common.sh@33 -- # return 0 00:04:11.466 02:52:06 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:11.466 02:52:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:11.466 02:52:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:11.466 02:52:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:11.466 02:52:06 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:11.466 02:52:06 -- setup/common.sh@18 -- # local node=1 00:04:11.466 02:52:06 -- setup/common.sh@19 -- # local var val 00:04:11.466 02:52:06 -- setup/common.sh@20 -- # local mem_f mem 00:04:11.466 02:52:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.466 02:52:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
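Here the trace switches from /proc/meminfo to the per-node files: when get_meminfo is given a node number and /sys/devices/system/node/nodeN/meminfo exists, that file is read instead, and the 'Node <n> ' prefix its lines carry is stripped so the same key scan works unchanged. A sketch of that source selection under the same assumptions (function name illustrative):

#!/usr/bin/env bash
# Sketch: pick the per-node meminfo file when a NUMA node is given and strip
# the "Node <n> " prefix so lines look like /proc/meminfo again.
shopt -s extglob
node_meminfo() {
    local node=$1 mem_f=/proc/meminfo
    local -a mem
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # no-op for /proc/meminfo lines
    printf '%s\n' "${mem[@]}"
}
node_meminfo 1 | grep HugePages_Surp   # node1 surplus pages; 0 in this run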
00:04:11.466 02:52:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:11.466 02:52:06 -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.466 02:52:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.466 02:52:06 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.466 02:52:06 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.466 02:52:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16427460 kB' 'MemUsed: 11275688 kB' 'SwapCached: 0 kB' 'Active: 6016504 kB' 'Inactive: 2173684 kB' 'Active(anon): 5757772 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7848016 kB' 'Mapped: 65440 kB' 'AnonPages: 342244 kB' 'Shmem: 5472672 kB' 'KernelStack: 10136 kB' 'PageTables: 3292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120184 kB' 'Slab: 409136 kB' 'SReclaimable: 120184 kB' 'SUnreclaim: 288952 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace trimmed: setup/common.sh@31-@32 test each node1 meminfo key from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skip it with 'continue']
00:04:11.467 02:52:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.467 02:52:06 -- setup/common.sh@33 -- # echo 0 00:04:11.467 02:52:06 -- setup/common.sh@33 -- # return 0 00:04:11.467 02:52:06 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:11.467 02:52:06 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:11.467 02:52:06 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:11.467 02:52:06 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:11.467 02:52:06 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:11.467 02:52:06 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:11.467 02:52:06 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:11.467 02:52:06 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:11.467 02:52:06 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:11.467 real 0m3.751s
00:04:11.467 user 0m1.347s
00:04:11.467 sys 0m2.473s
00:04:11.467 02:52:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:11.467 02:52:06 -- common/autotest_common.sh@10 -- # set +x
00:04:11.467 ************************************
00:04:11.467 END TEST even_2G_alloc
00:04:11.467 ************************************
00:04:11.467 02:52:06 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:11.467 02:52:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:11.467 02:52:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:11.467 02:52:06 -- common/autotest_common.sh@10 -- # set +x
00:04:11.467 ************************************
00:04:11.467 START TEST odd_alloc
00:04:11.467 ************************************
00:04:11.467 02:52:06 -- common/autotest_common.sh@1104 -- # odd_alloc 00:04:11.467 02:52:06 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:11.467 02:52:06 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:11.467 02:52:06 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:11.467 02:52:06 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:11.467 02:52:06 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:11.467 02:52:06 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:11.467 02:52:06 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:11.467 02:52:06 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:11.467 02:52:06 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:11.467 02:52:06 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:11.467 02:52:06
-- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:11.467 02:52:06 -- setup/hugepages.sh@83 -- # : 513 00:04:11.467 02:52:06 -- setup/hugepages.sh@84 -- # : 1 00:04:11.467 02:52:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:11.467 02:52:06 -- setup/hugepages.sh@83 -- # : 0 00:04:11.467 02:52:06 -- setup/hugepages.sh@84 -- # : 0 00:04:11.467 02:52:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:11.467 02:52:06 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:11.467 02:52:06 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:11.467 02:52:06 -- setup/hugepages.sh@160 -- # setup output 00:04:11.467 02:52:06 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.467 02:52:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:14.763 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:14.763 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:14.763 02:52:09 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:14.763 02:52:09 -- setup/hugepages.sh@89 -- # local node 00:04:14.763 02:52:09 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:14.763 02:52:09 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:14.763 02:52:09 -- setup/hugepages.sh@92 -- # local surp 00:04:14.763 02:52:09 -- setup/hugepages.sh@93 -- # local resv 00:04:14.763 02:52:09 -- setup/hugepages.sh@94 -- # local anon 00:04:14.763 02:52:09 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:14.763 02:52:09 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:14.763 02:52:09 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:14.763 02:52:09 -- setup/common.sh@18 -- # local node= 00:04:14.763 02:52:09 -- setup/common.sh@19 -- # local var val 00:04:14.763 02:52:09 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.763 02:52:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.763 02:52:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.763 02:52:09 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.763 02:52:09 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.763 
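odd_alloc asks for an odd pool on purpose (nr_hugepages=1025, HUGEMEM=2049) so the pages cannot split evenly across the two NUMA nodes; the assignments traced above give node1=512 and node0=513 before setup.sh reconfigures the system. A sketch of that distribution rule, inferred from the nodes_test assignments in the trace (function and variable names illustrative):

#!/usr/bin/env bash
# Sketch: spread a hugepage count over N nodes, with the integer remainder
# landing on the lower-numbered node(s), matching node0=513 / node1=512 above.
split_hugepages() {
    local total=$1 nodes=$2 n
    local -a per_node
    for ((n = 0; n < nodes; n++)); do
        per_node[n]=$((total / nodes))
        if ((n < total % nodes)); then
            per_node[n]=$((per_node[n] + 1))   # remainder page(s)
        fi
    done
    for ((n = 0; n < nodes; n++)); do
        echo "node$n=${per_node[n]}"
    done
}
split_hugepages 1025 2   # -> node0=513, node1=512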
02:52:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.763 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.763 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.763 02:52:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42919720 kB' 'MemAvailable: 45302960 kB' 'Buffers: 12536 kB' 'Cached: 11212084 kB' 'SwapCached: 16 kB' 'Active: 9458004 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982652 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590968 kB' 'Mapped: 185380 kB' 'Shmem: 8451968 kB' 'KReclaimable: 248208 kB' 'Slab: 782204 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533996 kB' 'KernelStack: 22304 kB' 'PageTables: 8996 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10406132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213716 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[xtrace trimmed: setup/common.sh@31-@32 test each meminfo key from MemTotal through HardwareCorrupted against \A\n\o\n\H\u\g\e\P\a\g\e\s and skip it with 'continue']
00:04:14.764 02:52:09 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.764 02:52:09 -- setup/common.sh@33 -- # echo 0 00:04:14.764 02:52:09 -- setup/common.sh@33 -- # return 0 00:04:14.764 02:52:09 -- setup/hugepages.sh@97 -- # anon=0 00:04:14.764 02:52:09 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:14.764 02:52:09 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.764 02:52:09 -- setup/common.sh@18 -- # local node= 00:04:14.765 02:52:09 -- setup/common.sh@19 -- # local var val 00:04:14.765 02:52:09 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.765 02:52:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.765 02:52:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.765 02:52:09 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.765 02:52:09 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.765 02:52:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # read -r var val _
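With transparent/anon hugepages ruled out (anon=0), the verifier re-reads the global pool and applies the invariant seen earlier in this trace: HugePages_Total must equal the requested page count plus surplus plus reserved pages. The arithmetic as a stand-alone sketch (values taken from this run; the awk lookups stand in for the traced get_meminfo calls):

#!/usr/bin/env bash
# Sketch: the consistency check behind '(( total == nr_hugepages + surp + resv ))'.
nr_hugepages=1025   # requested by odd_alloc
resv=0              # HugePages_Rsvd in this run
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
if ((total == nr_hugepages + surp + resv)); then
    echo "hugepage pool consistent: $total pages"
else
    echo "mismatch: have $total, expected $((nr_hugepages + surp + resv))" >&2
fi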
00:04:14.765 02:52:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42919120 kB' 'MemAvailable: 45302360 kB' 'Buffers: 12536 kB' 'Cached: 11212084 kB' 'SwapCached: 16 kB' 'Active: 9457872 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982520 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590836 kB' 'Mapped: 185340 kB' 'Shmem: 8451968 kB' 'KReclaimable: 248208 kB' 'Slab: 782204 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533996 kB' 'KernelStack: 22336 kB' 'PageTables: 8884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10402760 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213636 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[xtrace trimmed: setup/common.sh@31-@32 test each meminfo key from MemTotal through Committed_AS against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skip it with 'continue']
00:04:14.765 02:52:09 -- setup/common.sh@31 -- #
IFS=': ' 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 02:52:09 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.765 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 02:52:09 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.765 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ Unaccepted == 
00:04:14.766 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.766 02:52:09 -- setup/common.sh@33 -- # echo 0
00:04:14.766 02:52:09 -- setup/common.sh@33 -- # return 0
00:04:14.766 02:52:09 -- setup/hugepages.sh@99 -- # surp=0
00:04:14.766 02:52:09 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:14.766 02:52:09 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:14.766 02:52:09 -- setup/common.sh@18 -- # local node=
00:04:14.766 02:52:09 -- setup/common.sh@19 -- # local var val
00:04:14.766 02:52:09 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.766 02:52:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.766 02:52:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.766 02:52:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.766 02:52:09 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.766 02:52:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.766 02:52:09 -- setup/common.sh@31 -- # IFS=': '
00:04:14.766 02:52:09 -- setup/common.sh@31 -- # read -r var val _
00:04:14.766 02:52:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42921392 kB' 'MemAvailable: 45304632 kB' 'Buffers: 12536 kB' 'Cached: 11212084 kB' 'SwapCached: 16 kB' 'Active: 9456904 kB' 'Inactive: 2354388 kB' 'Active(anon): 8981552 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589868 kB' 'Mapped: 185340 kB' 'Shmem: 8451968 kB' 'KReclaimable: 248208 kB' 'Slab: 782204 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 533996 kB' 'KernelStack: 22048 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10401752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213540 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
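For readability, here is a minimal sketch of the get_meminfo helper that produces the scans condensed above, reconstructed from the traced commands themselves (the mem_f selection, the 'Node N ' prefix strip, and the IFS=': ' key/value split all appear verbatim in the xtrace); treat it as an approximation for the reader, not the shipped setup/common.sh:

#!/usr/bin/env bash
shopt -s extglob  # the +([0-9]) pattern in the prefix strip needs extended globbing

# get_meminfo KEY [NODE] -- print KEY's value from /proc/meminfo, or from
# /sys/devices/system/node/node$NODE/meminfo when a NUMA node is given.
get_meminfo() {
    local get=$1 node=$2
    local var val _ line
    local mem_f mem

    mem_f=/proc/meminfo
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")  # per-node files prefix every line with "Node N "

    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"  # e.g. "HugePages_Surp:   0" -> var, val
        [[ $var == "$get" ]] || continue        # the repeated check/continue seen in the trace
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo HugePages_Surp    # system-wide; prints 0 in this run
get_meminfo HugePages_Surp 0  # same key, restricted to NUMA node 0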
[xtrace condensed: per-key scan of the snapshot above for HugePages_Rsvd; every other key hits 'continue']
00:04:14.767 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:14.767 02:52:09 -- setup/common.sh@33 -- # echo 0
00:04:14.767 02:52:09 -- setup/common.sh@33 -- # return 0
00:04:14.767 02:52:09 -- setup/hugepages.sh@100 -- # resv=0
00:04:14.767 02:52:09 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:14.767 nr_hugepages=1025
00:04:14.767 02:52:09 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:14.767 resv_hugepages=0
00:04:14.767 02:52:09 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:14.767 surplus_hugepages=0
00:04:14.767 02:52:09 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:14.767 anon_hugepages=0
00:04:14.767 02:52:09 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:14.767 02:52:09 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:14.767 02:52:09 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:14.767 02:52:09 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:14.767 02:52:09 -- setup/common.sh@18 -- # local node=
00:04:14.767 02:52:09 -- setup/common.sh@19 -- # local var val
00:04:14.767 02:52:09 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.767 02:52:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.767 02:52:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.767 02:52:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.767 02:52:09 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.768 02:52:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.768 02:52:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42920720 kB' 'MemAvailable: 45303960 kB' 'Buffers: 12536 kB' 'Cached: 11212124 kB' 'SwapCached: 16 kB' 'Active: 9456456 kB' 'Inactive: 2354388 kB' 'Active(anon): 8981104 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589412 kB' 'Mapped: 185340 kB' 'Shmem: 8452008 kB' 'KReclaimable: 248208 kB' 'Slab: 782464 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534256 kB' 'KernelStack: 21984 kB' 'PageTables: 8300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10401764 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213540 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: per-key scan of the snapshot above for HugePages_Total; every other key hits 'continue']
00:04:14.769 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:14.769 02:52:09 -- setup/common.sh@33 -- # echo 1025
00:04:14.769 02:52:09 -- setup/common.sh@33 -- # return 0
00:04:14.769 02:52:09 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:14.769 02:52:09 -- setup/hugepages.sh@112 -- # get_nodes
00:04:14.769 02:52:09 -- setup/hugepages.sh@27 -- # local node
00:04:14.769 02:52:09 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:14.769 02:52:09 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:14.769 02:52:09 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:14.769 02:52:09 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:14.769 02:52:09 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:14.769 02:52:09 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:14.769 02:52:09 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:14.769 02:52:09 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:14.769 02:52:09 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
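Before the per-node scans that follow, the accounting setup/hugepages.sh is performing is easier to follow in plain form. A sketch with this run's numbers, reusing the get_meminfo sketch above (1025 pages requested in total; the nodes_sys reads above report 512 pages on node0 and 513 on node1; variable names mirror the trace, but the exact script body is inferred, not authoritative):

# The global pool must be fully accounted for: requested pages plus any
# surplus and reserved pages visible in /proc/meminfo.
nr_hugepages=1025
surp=$(get_meminfo HugePages_Surp)    # 0 in this run
resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
total=$(get_meminfo HugePages_Total)  # 1025 in this run
(( total == nr_hugepages + surp + resv )) || exit 1
# Consistency check visible in the snapshots: Hugetlb = 1025 * 2048 kB = 2099200 kB.

# Expected per-node split, adjusted by each node's reserved and surplus pages
# before it is compared against the kernel's per-node counters.
declare -A nodes_test=([0]=512 [1]=513)  # assumed expected split for this run
for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    node_surp=$(get_meminfo HugePages_Surp "$node")
    (( nodes_test[node] += node_surp ))
done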
00:04:14.769 02:52:09 -- setup/common.sh@18 -- # local node=0
00:04:14.769 02:52:09 -- setup/common.sh@19 -- # local var val
00:04:14.769 02:52:09 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.769 02:52:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.769 02:52:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:14.769 02:52:09 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:14.769 02:52:09 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.769 02:52:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.769 02:52:09 -- setup/common.sh@31 -- # IFS=': '
00:04:14.769 02:52:09 -- setup/common.sh@31 -- # read -r var val _
00:04:14.769 02:52:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26474076 kB' 'MemUsed: 6118008 kB' 'SwapCached: 16 kB' 'Active: 3440268 kB' 'Inactive: 180704 kB' 'Active(anon): 3223648 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376580 kB' 'Mapped: 120060 kB' 'AnonPages: 247516 kB' 'Shmem: 2979256 kB' 'KernelStack: 11848 kB' 'PageTables: 5060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 373392 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 245368 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace condensed: per-key scan of the node0 snapshot above for HugePages_Surp; every other key hits 'continue']
00:04:14.770 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.770 02:52:09 -- setup/common.sh@33 -- # echo 0 00:04:14.770 02:52:09 -- setup/common.sh@33 -- # return 0 00:04:14.770 02:52:09 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:14.770 02:52:09 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:14.770 02:52:09 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:14.770 02:52:09 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:14.770 02:52:09 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.770 02:52:09 -- setup/common.sh@18 -- # local node=1 00:04:14.770 02:52:09 -- setup/common.sh@19 -- # local var val 00:04:14.770 02:52:09 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.770 02:52:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.770 02:52:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:14.770 02:52:09 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:14.770 02:52:09 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.770 02:52:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.770 02:52:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16446392 kB' 'MemUsed: 11256756 kB' 'SwapCached: 0 kB' 'Active: 6016956 kB' 'Inactive: 2173684 kB' 'Active(anon): 5758224 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7848112 kB' 'Mapped: 65280 kB' 'AnonPages: 342668 kB' 'Shmem: 5472768 kB' 'KernelStack: 10168 kB' 'PageTables: 3348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120184 kB' 'Slab: 409072 kB' 'SReclaimable: 120184 kB' 'SUnreclaim: 288888 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.770 02:52:09 -- setup/common.sh@32 -- # continue 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.770 02:52:09 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.770 02:52:09 
00:04:14.771 02:52:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.771 02:52:09 -- setup/common.sh@33 -- # echo 0
00:04:14.771 02:52:09 -- setup/common.sh@33 -- # return 0
00:04:14.771 02:52:09 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:14.771 02:52:09 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:14.771 02:52:09 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
node0=512 expecting 513
00:04:14.771 02:52:09 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:14.771 02:52:09 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:14.771 02:52:09 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
node1=513 expecting 512
00:04:14.771 02:52:09 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:14.771
00:04:14.771 real 0m3.479s
00:04:14.771 user 0m1.280s
00:04:14.771 sys 0m2.244s
00:04:14.771 02:52:09 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:14.771 02:52:09 -- common/autotest_common.sh@10 -- # set +x
00:04:14.771 ************************************
00:04:14.771 END TEST odd_alloc
00:04:14.771 ************************************
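The @126-@130 lines above are the tail of odd_alloc's verification: plain bash index arrays serve as sets (assigning sorted_t[count]=1 keys the array by the count itself), so expanding the indices yields the distinct per-node counts in sorted order, and the final [[ 512 513 == \5\1\2\ \5\1\3 ]] compares the two sides as strings. A reconstructed sketch of that idiom; the array values come from the trace, the final comparison line is an assumption:

# Index-arrays-as-sets, as used at setup/hugepages.sh@126-@130 (reconstructed).
declare -a sorted_t=() sorted_s=()
nodes_test=([0]=512 [1]=513)   # test counts reported in the trace above
nodes_sys=([0]=513 [1]=512)    # 'expecting' counts reported in the trace above

for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1    # key the array by the count -> set semantics
    sorted_s[nodes_sys[node]]=1
    echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
done

# "${!arr[*]}" expands the indices in increasing order, "512 513" for both
# arrays here, so one string compare checks both sides saw the same counts.
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo 'hugepage distribution verified'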
00:04:14.771 02:52:09 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:14.771 02:52:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:14.771 02:52:09 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:14.771 02:52:09 -- common/autotest_common.sh@10 -- # set +x
00:04:14.771 ************************************
00:04:14.771 START TEST custom_alloc
00:04:14.771 ************************************
00:04:14.771 02:52:09 -- common/autotest_common.sh@1104 -- # custom_alloc
00:04:14.771 02:52:09 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:14.771 02:52:09 -- setup/hugepages.sh@169 -- # local node
00:04:14.771 02:52:09 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:14.771 02:52:09 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:14.771 02:52:09 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:14.771 02:52:09 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:14.771 02:52:09 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:14.771 02:52:09 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:14.771 02:52:09 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:14.771 02:52:09 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:14.771 02:52:09 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:14.771 02:52:09 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:14.771 02:52:09 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:14.771 02:52:09 -- setup/hugepages.sh@83 -- # : 256
00:04:14.771 02:52:09 -- setup/hugepages.sh@84 -- # : 1
00:04:14.771 02:52:09 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:14.771 02:52:09 -- setup/hugepages.sh@83 -- # : 0
00:04:14.771 02:52:09 -- setup/hugepages.sh@84 -- # : 0
00:04:14.771 02:52:09 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
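get_test_nr_hugepages, traced just above, takes a pool size in kB and derives a page count; the numbers are consistent with dividing by a 2048 kB default hugepage size (1048576 / 2048 = 512), which the meminfo dumps below also report as Hugepagesize. The per-node helper then splits the count evenly over the two nodes (256 + 256). A sketch of that arithmetic; the division by default_hugepages is inferred from the values, not read from the script:

# Inferred sketch of the size -> per-node page-count math (hugepages.sh@49-@84).
default_hugepages=2048                 # kB per huge page (Hugepagesize in meminfo)
size=1048576                           # requested pool size in kB (1 GiB)
_no_nodes=2                            # NUMA nodes on this box

(( size >= default_hugepages )) || exit 1          # the @55 guard above
nr_hugepages=$(( size / default_hugepages ))       # 512 pages total

declare -a nodes_test
for (( node = 0; node < _no_nodes; node++ )); do
    nodes_test[node]=$(( nr_hugepages / _no_nodes ))   # 256 pages per node
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=256 node1=256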
00:04:14.771 02:52:09 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:14.771 02:52:09 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:14.771 02:52:09 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:14.771 02:52:09 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:14.771 02:52:09 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:14.771 02:52:09 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:14.771 02:52:09 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:14.771 02:52:09 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:14.771 02:52:09 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:14.771 02:52:09 -- setup/hugepages.sh@78 -- # return 0
00:04:14.771 02:52:09 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:14.771 02:52:09 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:14.771 02:52:09 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:14.771 02:52:09 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:04:14.771 02:52:09 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:14.771 02:52:09 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:14.771 02:52:09 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:14.771 02:52:09 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:14.771 02:52:09 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:14.771 02:52:09 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:04:14.771 02:52:09 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:14.771 02:52:09 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:14.771 02:52:09 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:04:14.771 02:52:09 -- setup/hugepages.sh@78 -- # return 0
00:04:14.771 02:52:09 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:04:14.771 02:52:09 -- setup/hugepages.sh@187 -- # setup output
00:04:14.771 02:52:09 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:14.772 02:52:09 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
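custom_alloc then serializes the per-node plan into the comma-separated HUGENODE string handed to setup.sh (the local IFS=, at @167 exists for this join). Below is a hypothetical decoder for that format, only to show what the string encodes; setup.sh's real parsing may differ:

# Hypothetical decoder for HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'.
HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'

declare -A per_node=()
IFS=, read -ra entries <<< "$HUGENODE"
for entry in "${entries[@]}"; do                 # e.g. nodes_hp[0]=512
    node=${entry#'nodes_hp['}; node=${node%%]*}  # -> 0
    per_node[$node]=${entry#*=}                  # -> 512
done
for node in "${!per_node[@]}"; do
    echo "node$node: ${per_node[$node]} hugepages"   # e.g. node0: 512
done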
00:04:18.067 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:18.067 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:18.067 02:52:12 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:04:18.067 02:52:12 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:04:18.067 02:52:12 -- setup/hugepages.sh@89 -- # local node
00:04:18.068 02:52:12 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:18.068 02:52:12 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:18.068 02:52:12 -- setup/hugepages.sh@92 -- # local surp
00:04:18.068 02:52:12 -- setup/hugepages.sh@93 -- # local resv
00:04:18.068 02:52:12 -- setup/hugepages.sh@94 -- # local anon
00:04:18.068 02:52:12 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:18.068 02:52:12 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:18.068 02:52:12 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:18.068 02:52:12 -- setup/common.sh@18 -- # local node=
00:04:18.068 02:52:12 -- setup/common.sh@19 -- # local var val
00:04:18.068 02:52:12 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.068 02:52:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.068 02:52:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:18.068 02:52:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:18.068 02:52:12 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.068 02:52:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.068 02:52:12 -- setup/common.sh@31 -- # IFS=': '
00:04:18.068 02:52:12 -- setup/common.sh@31 -- # read -r var val _
00:04:18.068 02:52:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41877220 kB' 'MemAvailable: 44260460 kB' 'Buffers: 12536 kB' 'Cached: 11212224 kB' 'SwapCached: 16 kB' 'Active: 9458340 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982988 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591176 kB' 'Mapped: 186316 kB' 'Shmem: 8452108 kB' 'KReclaimable: 248208 kB' 'Slab: 782824 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534616 kB' 'KernelStack: 21952 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10403868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213572 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
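The @96 check above inspects transparent hugepage state: the kernel control file prints all modes with the active one bracketed (here "always [madvise] never"), and verify_nr_hugepages appears to query AnonHugePages only when the active mode is not [never]. A minimal sketch of that gate:

# Sketch of the THP gate at hugepages.sh@96. The control file shows the
# active mode in brackets, e.g. "always [madvise] never".
thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *"[never]"* ]]; then
    # THP active: anonymous hugepages may exist, so sample AnonHugePages too.
    grep AnonHugePages /proc/meminfo
fi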
00:04:18.068 02:52:12 [... xtrace elided: the read loop skips each /proc/meminfo key (MemTotal through HardwareCorrupted) until AnonHugePages ...]
00:04:18.069 02:52:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:18.069 02:52:12 -- setup/common.sh@33 -- # echo 0
00:04:18.069 02:52:12 -- setup/common.sh@33 -- # return 0
00:04:18.069 02:52:12 -- setup/hugepages.sh@97 -- # anon=0
00:04:18.069 02:52:12 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:18.069 02:52:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:18.069 02:52:12 -- setup/common.sh@18 -- # local node=
00:04:18.069 02:52:12 -- setup/common.sh@19 -- # local var val
00:04:18.069 02:52:12 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.069 02:52:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.069 02:52:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:18.069 02:52:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:18.069 02:52:12 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.069 02:52:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.069 02:52:12 -- setup/common.sh@31 -- # IFS=': '
00:04:18.069 02:52:12 -- setup/common.sh@31 -- # read -r var val _
00:04:18.069 02:52:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41878280 kB' 'MemAvailable: 44261520 kB' 'Buffers: 12536 kB' 'Cached: 11212224 kB' 'SwapCached: 16 kB' 'Active: 9461560 kB' 'Inactive: 2354388 kB' 'Active(anon): 8986208 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594956 kB' 'Mapped: 186284 kB' 'Shmem: 8452108 kB' 'KReclaimable: 248208 kB' 'Slab: 782776 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534568 kB' 'KernelStack: 21936 kB' 'PageTables: 8120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10407576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213524 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
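Each of these lookups rescans the whole file with the read loop; for comparison, outside the harness the same query is a one-liner (equivalent only, not taken from the SPDK scripts):

awk '$1 == "HugePages_Surp:" { print $2 }' /proc/meminfo   # -> 0 on this host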
00:04:18.069 02:52:12 [... xtrace elided: the read loop skips each /proc/meminfo key until HugePages_Surp ...]
00:04:18.070 02:52:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:18.070 02:52:12 -- setup/common.sh@33 -- # echo 0
00:04:18.070 02:52:12 -- setup/common.sh@33 -- # return 0
00:04:18.070 02:52:12 -- setup/hugepages.sh@99 -- # surp=0
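At this point verify_nr_hugepages has anon=0 (@97) and surp=0 (@99) and is about to fetch HugePages_Rsvd (@100). A plausible reconstruction of how the three values could feed the final check, using the get_meminfo sketch from earlier and based only on the lines visible in this trace; the script's exact arithmetic may differ:

# Plausible reconstruction of the verify flow; hugepages.sh may differ in detail.
anon=$(get_meminfo AnonHugePages)     # 0 here: THP contributed no pages
surp=$(get_meminfo HugePages_Surp)    # surplus pages don't count toward the goal
resv=$(get_meminfo HugePages_Rsvd)    # reserved pages still belong to the pool

total=$(get_meminfo HugePages_Total)  # 1536 in the dumps above (512 + 1024)
if (( total - surp == 1536 )); then   # nr_hugepages set at @188
    echo 'hugepage pool size verified'
fi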
00:04:18.070 02:52:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:18.070 02:52:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:18.070 02:52:12 -- setup/common.sh@18 -- # local node=
00:04:18.070 02:52:12 -- setup/common.sh@19 -- # local var val
00:04:18.070 02:52:12 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.070 02:52:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.070 02:52:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:18.070 02:52:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:18.070 02:52:12 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.070 02:52:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.070 02:52:12 -- setup/common.sh@31 -- # IFS=': '
00:04:18.070 02:52:12 -- setup/common.sh@31 -- # read -r var val _
00:04:18.070 02:52:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41877524 kB' 'MemAvailable: 44260764 kB' 'Buffers: 12536 kB' 'Cached: 11212236 kB' 'SwapCached: 16 kB' 'Active: 9457532 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982180 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590444 kB' 'Mapped: 186032 kB' 'Shmem: 8452120 kB' 'KReclaimable: 248208 kB' 'Slab: 782828 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534620 kB' 'KernelStack: 21936 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10402536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213524 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:18.071 02:52:12 [... xtrace elided: the read loop is again skipping /proc/meminfo keys on the way to HugePages_Rsvd; the excerpt ends mid-scan ...]
_ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # continue 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # continue 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # continue 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # continue 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # continue 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # continue 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.071 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.071 02:52:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.071 02:52:12 -- setup/common.sh@33 -- # echo 0 00:04:18.071 02:52:12 -- setup/common.sh@33 -- # return 0 00:04:18.071 02:52:12 -- setup/hugepages.sh@100 -- # resv=0 00:04:18.071 02:52:12 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:18.071 nr_hugepages=1536 00:04:18.071 02:52:12 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:18.071 resv_hugepages=0 00:04:18.071 02:52:12 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:18.071 surplus_hugepages=0 00:04:18.071 02:52:12 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:18.071 anon_hugepages=0 00:04:18.071 02:52:12 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:18.071 02:52:12 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:18.071 02:52:12 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:18.071 02:52:12 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:18.071 02:52:12 -- setup/common.sh@18 -- # local node= 00:04:18.071 02:52:12 -- setup/common.sh@19 -- # local var val 00:04:18.071 02:52:12 -- setup/common.sh@20 -- # local mem_f mem 00:04:18.071 02:52:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.071 02:52:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.071 02:52:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.071 02:52:12 -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.072 02:52:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.072 02:52:12 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.072 02:52:12 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.072 02:52:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41876784 kB' 'MemAvailable: 44260024 kB' 
00:04:18.072 02:52:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41876784 kB' 'MemAvailable: 44260024 kB' 'Buffers: 12536 kB' 'Cached: 11212264 kB' 'SwapCached: 16 kB' 'Active: 9457116 kB' 'Inactive: 2354388 kB' 'Active(anon): 8981764 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589944 kB' 'Mapped: 185696 kB' 'Shmem: 8452148 kB' 'KReclaimable: 248208 kB' 'Slab: 782828 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534620 kB' 'KernelStack: 21920 kB' 'PageTables: 8080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10402552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213524 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:18.072 [xtrace repeats for each /proc/meminfo field, MemTotal through Unaccepted: each fails the \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l match and the loop continues]
00:04:18.073 02:52:13 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:18.073 02:52:13 -- setup/common.sh@33 -- # echo 1536
00:04:18.073 02:52:13 -- setup/common.sh@33 -- # return 0
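The trace above is setup/common.sh's get_meminfo walking the captured /proc/meminfo array one field at a time: each line is split with IFS=': ' into a key and a value, non-matching keys hit `continue`, and the value of the first matching key is echoed back. A minimal sketch of the same lookup technique (get_meminfo_sketch is an illustrative name, not the SPDK helper itself):

  # Sketch: return one field of /proc/meminfo, the way the traced loop does.
  get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && { echo "$val"; return 0; }   # value in kB (or pages)
      done < /proc/meminfo
      return 1                                                  # field not present
  }
  # Usage: get_meminfo_sketch HugePages_Total   -> 1536 on this runner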
00:04:18.073 02:52:13 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:18.073 02:52:13 -- setup/hugepages.sh@112 -- # get_nodes
00:04:18.073 02:52:13 -- setup/hugepages.sh@27 -- # local node
00:04:18.073 02:52:13 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:18.073 02:52:13 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:18.073 02:52:13 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:18.073 02:52:13 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:18.073 02:52:13 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:18.073 02:52:13 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:18.073 02:52:13 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:18.073 02:52:13 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:18.073 02:52:13 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:18.073 02:52:13 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:18.073 02:52:13 -- setup/common.sh@18 -- # local node=0
00:04:18.073 02:52:13 -- setup/common.sh@19 -- # local var val
00:04:18.073 02:52:13 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.073 02:52:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.073 02:52:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:18.073 02:52:13 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:18.073 02:52:13 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.073 02:52:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.073 02:52:13 -- setup/common.sh@31 -- # IFS=': '
00:04:18.073 02:52:13 -- setup/common.sh@31 -- # read -r var val _
00:04:18.073 02:52:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26477192 kB' 'MemUsed: 6114892 kB' 'SwapCached: 16 kB' 'Active: 3439384 kB' 'Inactive: 180704 kB' 'Active(anon): 3222764 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376592 kB' 'Mapped: 120264 kB' 'AnonPages: 246608 kB' 'Shmem: 2979268 kB' 'KernelStack: 11720 kB' 'PageTables: 4704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 373396 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 245372 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:18.073 [xtrace repeats for each node0 meminfo field, MemTotal through HugePages_Free: each fails the \H\u\g\e\P\a\g\e\s\_\S\u\r\p match and the loop continues]
00:04:18.074 02:52:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:18.074 02:52:13 -- setup/common.sh@33 -- # echo 0
00:04:18.074 02:52:13 -- setup/common.sh@33 -- # return 0
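When get_meminfo is called with a node argument (HugePages_Surp 0 above), mem_f switches to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that the extglob substitution at common.sh@29 strips before the same key scan runs. A sketch of that per-node read, under the same assumptions (get_node_meminfo_sketch is an illustrative name):

  # Sketch: per-node lookup; node meminfo lines look like
  # "Node 0 HugePages_Total:   512", so drop the "Node N " prefix first.
  shopt -s extglob
  get_node_meminfo_sketch() {
      local node=$1 get=$2 line var val _
      local -a mem
      mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
      mem=("${mem[@]#Node +([0-9]) }")        # same strip as common.sh@29
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }
  # Usage: get_node_meminfo_sketch 0 HugePages_Surp   -> 0 here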
00:04:18.074 02:52:13 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:18.074 02:52:13 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:18.074 02:52:13 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:18.074 02:52:13 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:18.074 02:52:13 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:18.074 02:52:13 -- setup/common.sh@18 -- # local node=1
00:04:18.074 02:52:13 -- setup/common.sh@19 -- # local var val
00:04:18.074 02:52:13 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.074 02:52:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.074 02:52:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:18.074 02:52:13 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:18.074 02:52:13 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.074 02:52:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.074 02:52:13 -- setup/common.sh@31 -- # IFS=': '
00:04:18.074 02:52:13 -- setup/common.sh@31 -- # read -r var val _
00:04:18.074 02:52:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 15400864 kB' 'MemUsed: 12302284 kB' 'SwapCached: 0 kB' 'Active: 6018116 kB' 'Inactive: 2173684 kB' 'Active(anon): 5759384 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7848240 kB' 'Mapped: 65432 kB' 'AnonPages: 343732 kB' 'Shmem: 5472896 kB' 'KernelStack: 10216 kB' 'PageTables: 3432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120184 kB' 'Slab: 409432 kB' 'SReclaimable: 120184 kB' 'SUnreclaim: 289248 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
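get_nodes (hugepages.sh@27-@33 above) discovers the NUMA topology by globbing /sys/devices/system/node/node+([0-9]) and recording each node's configured hugepage count -- 512 on node0 and 1024 on node1 for this custom_alloc run. A sketch of that discovery; the nr_hugepages sysfs path is the standard kernel location for 2048 kB pages, but the variable names are illustrative:

  # Sketch: enumerate NUMA nodes and their per-node 2048 kB hugepage counts.
  shopt -s extglob nullglob
  declare -A node_pages
  for node in /sys/devices/system/node/node+([0-9]); do
      node_pages[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
  done
  for id in "${!node_pages[@]}"; do
      echo "node${id}: ${node_pages[$id]} hugepages"
  done
  # Expected here: node0: 512 hugepages, node1: 1024 hugepages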
00:04:18.075 [xtrace repeats for each node1 meminfo field, MemTotal through HugePages_Free: each fails the \H\u\g\e\P\a\g\e\s\_\S\u\r\p match and the loop continues]
00:04:18.075 02:52:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:18.075 02:52:13 -- setup/common.sh@33 -- # echo 0
00:04:18.075 02:52:13 -- setup/common.sh@33 -- # return 0
00:04:18.075 02:52:13 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
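With both per-node surplus reads back at 0, the hugepages.sh@126-@130 block below compares each node's test count against its expectation and the pair "512,1024" against the configured split. The same arithmetic, as a sketch with this run's numbers (variable names are illustrative):

  # Sketch: the custom_alloc invariant verified below -- 512 + 1024 + surp + resv == 1536.
  nr_hugepages=1536 surp=0 resv=0
  declare -a nodes_expect=([0]=512 [1]=1024)
  total=0
  for node in "${!nodes_expect[@]}"; do
      (( total += nodes_expect[node] ))
      echo "node${node}=${nodes_expect[node]} expecting ${nodes_expect[node]}"
  done
  (( total + surp + resv == nr_hugepages )) && echo OK || echo MISMATCH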
00:04:18.075 02:52:13 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:18.075 02:52:13 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:18.075 02:52:13 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:18.075 02:52:13 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:18.075 node0=512 expecting 512
00:04:18.075 02:52:13 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:18.075 02:52:13 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:18.075 02:52:13 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:18.075 02:52:13 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:18.075 node1=1024 expecting 1024
00:04:18.075 02:52:13 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:18.075
00:04:18.075 real 0m3.304s
00:04:18.075 user 0m1.188s
00:04:18.075 sys 0m2.135s
00:04:18.075 02:52:13 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:18.075 02:52:13 -- common/autotest_common.sh@10 -- # set +x
00:04:18.075 ************************************
00:04:18.075 END TEST custom_alloc
00:04:18.075 ************************************
00:04:18.075 02:52:13 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:18.075 02:52:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:18.075 02:52:13 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:18.075 02:52:13 -- common/autotest_common.sh@10 -- # set +x
00:04:18.075 ************************************
00:04:18.075 START TEST no_shrink_alloc
00:04:18.075 ************************************
00:04:18.075 02:52:13 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:04:18.075 02:52:13 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:18.075 02:52:13 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:18.075 02:52:13 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:18.075 02:52:13 -- setup/hugepages.sh@51 -- # shift
00:04:18.075 02:52:13 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:18.075 02:52:13 -- setup/hugepages.sh@52 -- # local node_ids
00:04:18.075 02:52:13 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:18.075 02:52:13 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:18.075 02:52:13 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:18.075 02:52:13 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:18.075 02:52:13 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:18.075 02:52:13 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:18.075 02:52:13 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:18.075 02:52:13 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:18.075 02:52:13 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:18.075 02:52:13 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:18.075 02:52:13 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:18.075 02:52:13 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:18.075 02:52:13 -- setup/hugepages.sh@73 -- # return 0
00:04:18.075 02:52:13 -- setup/hugepages.sh@198 -- # setup output
00:04:18.075 02:52:13 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:18.075 02:52:13 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:20.609 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.609 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
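The setup.sh pass above reports every listed PCI function (and 0000:d8:00.0, evidently the runner's NVMe drive) already bound to vfio-pci. That binding can be confirmed straight from sysfs, since each device directory exposes a `driver` symlink; a sketch under that standard kernel layout (the device list is illustrative):

  # Sketch: report the bound driver for a few PCI functions from this log.
  for dev in 0000:00:04.0 0000:80:04.0 0000:d8:00.0; do
      drv=$(readlink -f "/sys/bus/pci/devices/$dev/driver" 2>/dev/null)
      echo "$dev -> ${drv##*/}"     # expect vfio-pci on this runner
  done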
00:04:20.873 02:52:15 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:20.873 02:52:15 -- setup/hugepages.sh@89 -- # local node
00:04:20.873 02:52:15 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:20.873 02:52:15 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:20.873 02:52:15 -- setup/hugepages.sh@92 -- # local surp
00:04:20.873 02:52:15 -- setup/hugepages.sh@93 -- # local resv
00:04:20.873 02:52:15 -- setup/hugepages.sh@94 -- # local anon
00:04:20.873 02:52:15 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:20.873 02:52:15 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:20.873 02:52:15 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:20.873 02:52:15 -- setup/common.sh@18 -- # local node=
00:04:20.873 02:52:15 -- setup/common.sh@19 -- # local var val
00:04:20.873 02:52:15 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.873 02:52:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.873 02:52:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.873 02:52:15 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.873 02:52:15 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.873 02:52:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.873 02:52:15 -- setup/common.sh@31 -- # IFS=': '
00:04:20.873 02:52:15 -- setup/common.sh@31 -- # read -r var val _
00:04:20.874 02:52:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42939996 kB' 'MemAvailable: 45323236 kB' 'Buffers: 12536 kB' 'Cached: 11212348 kB' 'SwapCached: 16 kB' 'Active: 9458420 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983068 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590632 kB' 'Mapped: 185824 kB' 'Shmem: 8452232 kB' 'KReclaimable: 248208 kB' 'Slab: 782920 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534712 kB' 'KernelStack: 21952 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213604 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:20.874 [xtrace repeats for each /proc/meminfo field, MemTotal through HardwareCorrupted: each fails the \A\n\o\n\H\u\g\e\P\a\g\e\s match and the loop continues]
00:04:20.875 02:52:15 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:20.875 02:52:15 -- setup/common.sh@33 -- # echo 0
00:04:20.875 02:52:15 -- setup/common.sh@33 -- # return 0
00:04:20.875 02:52:15 -- setup/hugepages.sh@97 -- # anon=0
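The AnonHugePages read is gated at hugepages.sh@96: `[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]` only passes when transparent hugepages are not globally disabled, i.e. when the bracketed mode in the sysfs knob is anything other than [never]. A sketch of the same gate (thp and anon are illustrative names):

  # Sketch: count anonymous THP only when transparent_hugepage is not [never].
  thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
  if [[ $thp != *"[never]"* ]]; then
      anon=$(awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo)
  else
      anon=0
  fi
  echo "anon_hugepages=${anon:-0}"   # 0 kB on this runner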
00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 
00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # continue 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.874 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.874 02:52:15 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.874 02:52:15 -- setup/common.sh@33 -- # echo 0 00:04:20.875 02:52:15 -- setup/common.sh@33 -- # return 0 00:04:20.875 02:52:15 -- setup/hugepages.sh@97 -- # anon=0 00:04:20.875 02:52:15 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:20.875 02:52:15 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.875 02:52:15 -- setup/common.sh@18 -- # local node= 00:04:20.875 02:52:15 -- setup/common.sh@19 -- # local var val 00:04:20.875 02:52:15 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.875 02:52:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.875 02:52:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.875 02:52:15 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.875 02:52:15 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.875 02:52:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.875 02:52:15 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.875 02:52:15 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.875 02:52:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42939912 kB' 'MemAvailable: 45323152 kB' 'Buffers: 12536 kB' 'Cached: 11212352 kB' 'SwapCached: 16 kB' 'Active: 9458132 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982780 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
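The setup/common.sh trace above (@31 read loop, @32 key match, @33 echo/return) is one complete pass of the get_meminfo helper. Reconstructed purely from the xtrace lines, so it is a minimal sketch rather than the verbatim SPDK source; the real function's line numbers and control flow may differ:

    #!/usr/bin/env bash
    shopt -s extglob    # needed for the +([0-9]) patterns below

    get_meminfo() {     # usage: get_meminfo <Key> [<numa-node>]
        local get=$1
        local node=$2
        local var val
        local mem_f mem

        mem_f=/proc/meminfo
        # With a node argument, prefer the per-node view. With node empty,
        # /sys/devices/system/node/node/meminfo never exists, so the
        # system-wide file is used -- exactly what the @23 check shows.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        # Per-node files prefix each line with "Node <n> "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")

        # Scan "Key: value [kB]" lines until the requested key matches;
        # the trailing _ swallows the "kB" unit so only the number is echoed.
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

This linear scan is why the log shows one [[ ... ]]/continue pair per meminfo key for every single query.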
00:04:20.875 02:52:15 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:20.875 02:52:15 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.875 02:52:15 -- setup/common.sh@18 -- # local node=
00:04:20.875 02:52:15 -- setup/common.sh@19 -- # local var val
00:04:20.875 02:52:15 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.875 02:52:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.875 02:52:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.875 02:52:15 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.875 02:52:15 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.875 02:52:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.875 02:52:15 -- setup/common.sh@31 -- # IFS=': '
00:04:20.875 02:52:15 -- setup/common.sh@31 -- # read -r var val _
00:04:20.875 02:52:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42939912 kB' 'MemAvailable: 45323152 kB' 'Buffers: 12536 kB' 'Cached: 11212352 kB' 'SwapCached: 16 kB' 'Active: 9458132 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982780 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590892 kB' 'Mapped: 185700 kB' 'Shmem: 8452236 kB' 'KReclaimable: 248208 kB' 'Slab: 782876 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534668 kB' 'KernelStack: 21936 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213588 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... trace condensed: the @31/@32 scan walks every key from MemTotal through HugePages_Rsvd; none matches HugePages_Surp, so each iteration hits continue ...]
00:04:20.876 02:52:15 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.876 02:52:15 -- setup/common.sh@33 -- # echo 0
00:04:20.876 02:52:15 -- setup/common.sh@33 -- # return 0
00:04:20.876 02:52:15 -- setup/hugepages.sh@99 -- # surp=0
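For a one-off query the same number can be pulled without the per-key loop. An equivalent stand-alone one-liner (not from the SPDK scripts, just an illustration of what the scan computes):

    awk '$1 == "HugePages_Surp:" { print $2 }' /proc/meminfo    # prints 0 on this host

The script's loop form exists because it also has to serve the per-node files, which need the "Node <n> " prefix stripped first.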
00:04:20.876 02:52:15 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:20.876 02:52:15 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
[... trace condensed: @18-@31 preamble identical to the HugePages_Surp call above (node empty, mem_f=/proc/meminfo, mapfile + Node-prefix strip) ...]
00:04:20.876 02:52:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42939660 kB' 'MemAvailable: 45322900 kB' 'Buffers: 12536 kB' 'Cached: 11212360 kB' 'SwapCached: 16 kB' 'Active: 9458492 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983140 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591256 kB' 'Mapped: 185700 kB' 'Shmem: 8452244 kB' 'KReclaimable: 248208 kB' 'Slab: 782876 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534668 kB' 'KernelStack: 21952 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213588 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... trace condensed: the @31/@32 scan walks every key from MemTotal through HugePages_Free; none matches HugePages_Rsvd, so each iteration hits continue (the wall clock ticks over to 02:52:16 mid-scan) ...]
00:04:20.877 02:52:16 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:20.877 02:52:16 -- setup/common.sh@33 -- # echo 0
00:04:20.877 02:52:16 -- setup/common.sh@33 -- # return 0
00:04:20.877 02:52:16 -- setup/hugepages.sh@100 -- # resv=0
00:04:20.877 02:52:16 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:04:20.877 02:52:16 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:20.877 02:52:16 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:20.877 02:52:16 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:20.877 02:52:16 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.877 02:52:16 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
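The hugepages.sh@97-@110 lines are the actual verification step of this test: it queries the four counters and checks that the kernel's hugepage total is fully explained by the configured count. Condensed into a stand-alone sketch (variable names taken from the trace, get_meminfo as sketched earlier; the real script's structure may differ):

    verify_nr_hugepages() {
        local nr_hugepages=1024                  # count requested by the test
        local anon surp resv total

        anon=$(get_meminfo AnonHugePages)        # 0 in this run
        surp=$(get_meminfo HugePages_Surp)       # 0
        resv=$(get_meminfo HugePages_Rsvd)       # 0
        total=$(get_meminfo HugePages_Total)     # 1024

        echo "nr_hugepages=$nr_hugepages"
        echo "resv_hugepages=$resv"
        echo "surplus_hugepages=$surp"
        echo "anon_hugepages=$anon"

        # The kernel total must equal the configured count plus any surplus
        # and reserved pages -- here 1024 == 1024 + 0 + 0.
        (( total == nr_hugepages + surp + resv ))
    }

A false (( ... )) comparison returns non-zero, so under the suite's error handling a mismatch fails the test right at this point in the log.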
00:04:20.877 02:52:16 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:20.877 02:52:16 -- setup/common.sh@17 -- # local get=HugePages_Total
[... trace condensed: @18-@31 preamble identical to the calls above (node empty, mem_f=/proc/meminfo, mapfile + Node-prefix strip) ...]
00:04:20.878 02:52:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42939688 kB' 'MemAvailable: 45322928 kB' 'Buffers: 12536 kB' 'Cached: 11212360 kB' 'SwapCached: 16 kB' 'Active: 9458160 kB' 'Inactive: 2354388 kB' 'Active(anon): 8982808 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590944 kB' 'Mapped: 185700 kB' 'Shmem: 8452244 kB' 'KReclaimable: 248208 kB' 'Slab: 782876 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534668 kB' 'KernelStack: 21952 kB' 'PageTables: 8200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213588 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... trace condensed: the @31/@32 scan walks every key from MemTotal through Unaccepted; none matches HugePages_Total, so each iteration hits continue ...]
00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:20.879 02:52:16 -- setup/common.sh@33 -- # echo 1024
00:04:20.879 02:52:16 -- setup/common.sh@33 -- # return 0
00:04:20.879 02:52:16 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.879 02:52:16 -- setup/hugepages.sh@112 -- # get_nodes
00:04:20.879 02:52:16 -- setup/hugepages.sh@27 -- # local node
00:04:20.879 02:52:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.879 02:52:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:20.879 02:52:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.879 02:52:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:20.879 02:52:16 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:20.879 02:52:16 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:20.879 02:52:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:20.879 02:52:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:20.879 02:52:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:20.879 02:52:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
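The get_nodes walk (hugepages.sh@27-@33) discovers the NUMA topology before the per-node checks that follow. A hedged sketch of that loop; the sysfs counter it reads is an assumption, inferred only from the values 1024 and 0 recorded in the trace, and the real script may obtain them differently:

    shopt -s extglob
    declare -a nodes_sys=()

    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} strips everything through the last "node",
        # leaving the bare index (node0 -> 0, node1 -> 1).
        # ASSUMPTION: per-node counts come from the sysfs hugepage counter.
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done

    no_nodes=${#nodes_sys[@]}    # 2 on this host: node0 holds 1024, node1 holds 0
    (( no_nodes > 0 ))           # topology discovery must find at least one node

On this box all 1024 pages landed on node0, which is why the loop that follows re-runs get_meminfo with an explicit node argument.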
-- setup/common.sh@18 -- # local node=0 00:04:20.879 02:52:16 -- setup/common.sh@19 -- # local var val 00:04:20.879 02:52:16 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.879 02:52:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.879 02:52:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.879 02:52:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.879 02:52:16 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.879 02:52:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25427492 kB' 'MemUsed: 7164592 kB' 'SwapCached: 16 kB' 'Active: 3439900 kB' 'Inactive: 180704 kB' 'Active(anon): 3223280 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376640 kB' 'Mapped: 120268 kB' 'AnonPages: 247124 kB' 'Shmem: 2979316 kB' 'KernelStack: 11736 kB' 'PageTables: 4768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 373336 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 245312 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.879 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.879 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 
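The trace above is setup/common.sh's get_meminfo walking a per-node meminfo file: when a node is given it prefers /sys/devices/system/node/node0/meminfo over /proc/meminfo, strips the "Node 0 " prefix from every line, then splits each line on ':' and whitespace until the requested key matches and its value is echoed. A minimal standalone sketch of that approach (get_meminfo_sketch is a hypothetical name; the real function lives in setup/common.sh):

#!/usr/bin/env bash
# Sketch: look up one field in /proc/meminfo, or in a per-node meminfo
# file when a NUMA node is given. Mirrors the mapfile/IFS pattern seen
# in the trace; not the SPDK implementation itself.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    shopt -s extglob
    local -a mem
    mapfile -t mem <"$mem_f"
    # Per-node files prefix every line with "Node <n> "; strip it so the
    # same key comparison works for both the global and per-node files.
    mem=("${mem[@]#Node +([0-9]) }")
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}
# Example: get_meminfo_sketch HugePages_Free 0   ->   1024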
00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # continue 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.880 02:52:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.880 02:52:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.880 02:52:16 -- setup/common.sh@33 -- # echo 0 00:04:20.880 02:52:16 -- setup/common.sh@33 -- # return 0 00:04:20.880 02:52:16 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.880 02:52:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.880 02:52:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.880 02:52:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.880 02:52:16 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:20.880 node0=1024 expecting 1024 00:04:20.880 02:52:16 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:20.880 02:52:16 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:20.880 02:52:16 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:20.880 02:52:16 -- setup/hugepages.sh@202 -- # setup output 00:04:20.880 02:52:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:20.880 02:52:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:24.174 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:24.174 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:24.174 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:24.174 02:52:19 -- setup/hugepages.sh@204 -- # 
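The second setup run above is invoked with CLEAR_HUGE=no and NRHUGE=512, and the "Requested 512 hugepages but 1024 already allocated on node0" INFO line shows the expected behavior: without clearing, the script keeps the larger existing pool rather than shrinking it. A rough equivalent of the manual invocation, assuming the workspace path from this log:

# Re-request a smaller pool while keeping existing pages; the script
# leaves the 1024-page pool in place and only logs the INFO line.
sudo CLEAR_HUGE=no NRHUGE=512 \
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh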
verify_nr_hugepages 00:04:24.174 02:52:19 -- setup/hugepages.sh@89 -- # local node 00:04:24.174 02:52:19 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:24.174 02:52:19 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:24.174 02:52:19 -- setup/hugepages.sh@92 -- # local surp 00:04:24.174 02:52:19 -- setup/hugepages.sh@93 -- # local resv 00:04:24.174 02:52:19 -- setup/hugepages.sh@94 -- # local anon 00:04:24.174 02:52:19 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:24.174 02:52:19 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:24.174 02:52:19 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:24.174 02:52:19 -- setup/common.sh@18 -- # local node= 00:04:24.174 02:52:19 -- setup/common.sh@19 -- # local var val 00:04:24.174 02:52:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.174 02:52:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.174 02:52:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.174 02:52:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.174 02:52:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.174 02:52:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42969688 kB' 'MemAvailable: 45352928 kB' 'Buffers: 12536 kB' 'Cached: 11212452 kB' 'SwapCached: 16 kB' 'Active: 9459452 kB' 'Inactive: 2354388 kB' 'Active(anon): 8984100 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591584 kB' 'Mapped: 185836 kB' 'Shmem: 8452336 kB' 'KReclaimable: 248208 kB' 'Slab: 782644 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534436 kB' 'KernelStack: 21968 kB' 'PageTables: 8212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213620 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- 
setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.174 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.174 02:52:19 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.175 02:52:19 -- setup/common.sh@33 -- # echo 0 00:04:24.175 02:52:19 -- setup/common.sh@33 -- # return 0 00:04:24.175 02:52:19 -- setup/hugepages.sh@97 -- # anon=0 00:04:24.175 02:52:19 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:24.175 
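Before scanning for AnonHugePages, verify_nr_hugepages gates on the transparent-hugepage mode (the [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test at hugepages.sh@96 above): anonymous hugepages are only counted when THP is not pinned to "never". A small sketch of that gate, reusing the hypothetical get_meminfo_sketch from earlier; the sysfs path is the standard kernel location, not something this log shows directly:

# Count AnonHugePages only when transparent hugepages can actually occur.
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo_sketch AnonHugePages)
else
    anon=0
fi
echo "anon_hugepages=${anon:-0}"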
02:52:19 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.175 02:52:19 -- setup/common.sh@18 -- # local node= 00:04:24.175 02:52:19 -- setup/common.sh@19 -- # local var val 00:04:24.175 02:52:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.175 02:52:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.175 02:52:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.175 02:52:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.175 02:52:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.175 02:52:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42970024 kB' 'MemAvailable: 45353264 kB' 'Buffers: 12536 kB' 'Cached: 11212456 kB' 'SwapCached: 16 kB' 'Active: 9459508 kB' 'Inactive: 2354388 kB' 'Active(anon): 8984156 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591656 kB' 'Mapped: 185780 kB' 'Shmem: 8452340 kB' 'KReclaimable: 248208 kB' 'Slab: 782624 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534416 kB' 'KernelStack: 21936 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10415448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213604 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.175 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.175 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # 
continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.176 02:52:19 -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.176 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.176 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.177 02:52:19 -- setup/common.sh@33 -- # echo 0 00:04:24.177 02:52:19 -- setup/common.sh@33 -- # return 0 00:04:24.177 02:52:19 -- setup/hugepages.sh@99 -- # surp=0 00:04:24.177 02:52:19 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:24.177 02:52:19 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:24.177 02:52:19 -- setup/common.sh@18 -- # local node= 00:04:24.177 02:52:19 -- setup/common.sh@19 -- # local var val 00:04:24.177 02:52:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.177 02:52:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.177 02:52:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.177 02:52:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.177 02:52:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.177 02:52:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42970964 kB' 'MemAvailable: 45354204 kB' 'Buffers: 12536 kB' 'Cached: 11212464 kB' 'SwapCached: 16 kB' 
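At this point surp has been read back as 0 and the same scan starts over for HugePages_Rsvd. The counters the script keeps re-deriving one field at a time can be spot-checked by hand in a single step:

# Manual spot-check of the counters verify_nr_hugepages derives:
grep -E 'HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo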
'Active: 9458808 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983456 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591460 kB' 'Mapped: 185704 kB' 'Shmem: 8452348 kB' 'KReclaimable: 248208 kB' 'Slab: 782608 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534400 kB' 'KernelStack: 21952 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213556 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.177 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.177 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val 
_ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 
00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.178 02:52:19 -- setup/common.sh@33 -- # echo 0 00:04:24.178 02:52:19 -- setup/common.sh@33 -- # return 0 00:04:24.178 02:52:19 -- setup/hugepages.sh@100 -- # resv=0 00:04:24.178 02:52:19 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:24.178 nr_hugepages=1024 00:04:24.178 02:52:19 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.178 resv_hugepages=0 00:04:24.178 02:52:19 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:24.178 surplus_hugepages=0 00:04:24.178 02:52:19 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.178 anon_hugepages=0 00:04:24.178 02:52:19 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.178 02:52:19 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:24.178 02:52:19 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.178 02:52:19 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.178 02:52:19 -- setup/common.sh@18 -- # local node= 00:04:24.178 02:52:19 -- setup/common.sh@19 -- # local var val 00:04:24.178 02:52:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.178 02:52:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.178 02:52:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.178 02:52:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.178 02:52:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.178 02:52:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42971276 kB' 'MemAvailable: 45354516 kB' 'Buffers: 12536 kB' 'Cached: 11212480 kB' 'SwapCached: 16 kB' 'Active: 9458756 kB' 'Inactive: 2354388 kB' 'Active(anon): 8983404 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591404 kB' 'Mapped: 185704 kB' 'Shmem: 8452364 kB' 'KReclaimable: 248208 kB' 'Slab: 782608 kB' 'SReclaimable: 248208 kB' 'SUnreclaim: 534400 kB' 'KernelStack: 21952 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10403324 kB' 'VmallocTotal: 
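With nr_hugepages=1024 and resv, surplus, and anon all 0, the consistency checks above reduce to 1024 == 1024 + 0 + 0 and 1024 == 1024. A condensed sketch of that final verification, again built on the hypothetical get_meminfo_sketch rather than the real hugepages.sh internals:

# Sketch of the final consistency check: the kernel-reported total must
# equal the configured count plus surplus and reserved pages.
verify_nr_hugepages_sketch() {
    local expected=$1 total surp resv
    total=$(get_meminfo_sketch HugePages_Total)
    surp=$(get_meminfo_sketch HugePages_Surp)
    resv=$(get_meminfo_sketch HugePages_Rsvd)
    (( total == expected + surp + resv )) || return 1
    (( total == expected ))
}
# Example: verify_nr_hugepages_sketch 1024 && echo 'node0=1024 expecting 1024'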
34359738367 kB' 'VmallocUsed: 213556 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.178 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.178 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- 
setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.179 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.179 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var 
val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # continue 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.180 02:52:19 -- 
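The xtrace above is the harness's get_meminfo helper walking the /proc/meminfo snapshot one "Field: value" pair at a time until the requested field matches. A minimal standalone sketch of the same parsing pattern (simplified; the real helper also handles per-node files and the mapfile prefix strip):

# Sketch: return one field from /proc/meminfo the way the traced
# helper does -- split each line on ': ' and stop at the first match.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"        # e.g. "1024" for HugePages_Total
            return 0
        fi
    done < /proc/meminfo
    return 1                   # field not present
}

get_meminfo HugePages_Total    # prints 1024 on the machine in this log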
[... xtrace elided: the scan loop compares every /proc/meminfo field (MemTotal through Unaccepted) against HugePages_Total and skips each non-matching field with "continue" ...]
00:04:24.180 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:24.180 02:52:19 -- setup/common.sh@33 -- # echo 1024
00:04:24.180 02:52:19 -- setup/common.sh@33 -- # return 0
00:04:24.180 02:52:19 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:24.180 02:52:19 -- setup/hugepages.sh@112 -- # get_nodes
00:04:24.180 02:52:19 -- setup/hugepages.sh@27 -- # local node
00:04:24.180 02:52:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:24.180 02:52:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:24.180 02:52:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:24.180 02:52:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:24.180 02:52:19 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:24.180 02:52:19 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:24.180 02:52:19 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:24.180 02:52:19 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:24.180 02:52:19 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:24.180 02:52:19 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:24.180 02:52:19 -- setup/common.sh@18 -- # local node=0
00:04:24.180 02:52:19 -- setup/common.sh@19 -- # local var val
00:04:24.180 02:52:19 -- setup/common.sh@20 -- # local mem_f mem
00:04:24.180 02:52:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:24.180 02:52:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:24.180 02:52:19 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:24.180 02:52:19 -- setup/common.sh@28 -- # mapfile -t mem
00:04:24.180 02:52:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:24.180 02:52:19 -- setup/common.sh@31 -- # IFS=': '
00:04:24.180 02:52:19 -- setup/common.sh@31 -- # read -r var val _
00:04:24.180 02:52:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25450324 kB' 'MemUsed: 7141760 kB' 'SwapCached: 16 kB' 'Active: 3439584 kB' 'Inactive: 180704 kB' 'Active(anon): 3222964 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3376716 kB' 'Mapped: 120272 kB' 'AnonPages: 246788 kB' 'Shmem: 2979392 kB' 'KernelStack: 11736 kB' 'PageTables: 4756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 128024 kB' 'Slab: 373136 kB' 'SReclaimable: 128024 kB' 'SUnreclaim: 245112 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... xtrace elided: the scan loop compares every node0 meminfo field (MemTotal through HugePages_Free) against HugePages_Surp and skips each non-matching field with "continue" ...]
00:04:24.181 02:52:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:24.181 02:52:19 -- setup/common.sh@33 -- # echo 0
00:04:24.181 02:52:19 -- setup/common.sh@33 -- # return 0
00:04:24.181 02:52:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:24.181 02:52:19 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:24.181 02:52:19 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:24.181 02:52:19 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:24.181 02:52:19 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:24.181 node0=1024 expecting 1024
00:04:24.181 02:52:19 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:24.181
00:04:24.181 real	0m6.129s
00:04:24.181 user	0m2.146s
00:04:24.181 sys	0m3.992s
00:04:24.181 02:52:19 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:24.181 02:52:19 -- common/autotest_common.sh@10 -- # set +x
00:04:24.181 ************************************
00:04:24.181 END TEST no_shrink_alloc
00:04:24.181 ************************************
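The second lookup above switches the source file from /proc/meminfo to the per-NUMA-node view once a node argument is given. A small sketch of reading the per-node hugepage counters directly; the paths are the standard sysfs ones, and the "Node <n> " line prefix is exactly what the traced mapfile strip removes:

# Sketch: report HugePages_Total per NUMA node. Node meminfo lines
# look like "Node 0 HugePages_Total:  1024", hence awk's $4.
for node in /sys/devices/system/node/node[0-9]*; do
    n=${node##*node}
    total=$(awk '/HugePages_Total/ {print $4}' "$node/meminfo")
    echo "node$n=$total"       # the test expects node0=1024 here
done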
00:04:24.181 02:52:19 -- setup/hugepages.sh@217 -- # clear_hp
00:04:24.181 02:52:19 -- setup/hugepages.sh@37 -- # local node hp
00:04:24.181 02:52:19 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:24.181 02:52:19 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:24.181 02:52:19 -- setup/hugepages.sh@41 -- # echo 0
00:04:24.181 02:52:19 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:24.181 02:52:19 -- setup/hugepages.sh@41 -- # echo 0
00:04:24.181 02:52:19 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:24.181 02:52:19 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:24.181 02:52:19 -- setup/hugepages.sh@41 -- # echo 0
00:04:24.181 02:52:19 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:24.181 02:52:19 -- setup/hugepages.sh@41 -- # echo 0
00:04:24.181 02:52:19 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:24.181 02:52:19 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:24.181
00:04:24.181 real	0m25.721s
00:04:24.181 user	0m8.866s
00:04:24.181 sys	0m15.722s
00:04:24.181 02:52:19 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:24.181 02:52:19 -- common/autotest_common.sh@10 -- # set +x
00:04:24.181 ************************************
00:04:24.181 END TEST hugepages
00:04:24.181 ************************************
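clear_hp, traced above, releases the pool by writing 0 into every per-node nr_hugepages file. A minimal sketch of that teardown, assuming root and the standard sysfs layout:

# Sketch: free all reserved hugepages on every node and every page
# size (e.g. hugepages-2048kB and hugepages-1048576kB). Needs root.
for hp in /sys/devices/system/node/node[0-9]*/hugepages/hugepages-*; do
    echo 0 > "$hp/nr_hugepages"
done
export CLEAR_HUGE=yes          # flag read later by setup.sh in this harness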
00:04:24.181 02:52:19 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:24.181 02:52:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:24.181 02:52:19 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:24.181 02:52:19 -- common/autotest_common.sh@10 -- # set +x
00:04:24.181 ************************************
00:04:24.181 START TEST driver
00:04:24.181 ************************************
00:04:24.181 02:52:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:24.440 * Looking for test storage...
00:04:24.440 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:24.440 02:52:19 -- setup/driver.sh@68 -- # setup reset
00:04:24.440 02:52:19 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:24.440 02:52:19 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:28.683 02:52:23 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:28.683 02:52:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:28.683 02:52:23 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:28.683 02:52:23 -- common/autotest_common.sh@10 -- # set +x
00:04:28.683 ************************************
00:04:28.683 START TEST guess_driver
00:04:28.683 ************************************
00:04:28.683 02:52:23 -- common/autotest_common.sh@1104 -- # guess_driver
00:04:28.683 02:52:23 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:28.683 02:52:23 -- setup/driver.sh@47 -- # local fail=0
00:04:28.683 02:52:23 -- setup/driver.sh@49 -- # pick_driver
00:04:28.683 02:52:23 -- setup/driver.sh@36 -- # vfio
00:04:28.683 02:52:23 -- setup/driver.sh@21 -- # local iommu_groups
00:04:28.683 02:52:23 -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:28.683 02:52:23 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:28.683 02:52:23 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:28.683 02:52:23 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:28.683 02:52:23 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:04:28.683 02:52:23 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:28.683 02:52:23 -- setup/driver.sh@14 -- # mod vfio_pci
00:04:28.683 02:52:23 -- setup/driver.sh@12 -- # dep vfio_pci
00:04:28.683 02:52:23 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:28.683 02:52:23 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:04:28.683 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:28.683 02:52:23 -- setup/driver.sh@30 -- # return 0
00:04:28.683 02:52:23 -- setup/driver.sh@37 -- # echo vfio-pci
00:04:28.683 02:52:23 -- setup/driver.sh@49 -- # driver=vfio-pci
00:04:28.683 02:52:23 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:28.683 02:52:23 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:28.683 Looking for driver=vfio-pci
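pick_driver settles on vfio-pci here because the host exposes 176 IOMMU groups and the module resolves. A compact sketch of that decision; is_driver below condenses the traced mod/dep helpers, and the uio_pci_generic fallback is an assumption about how such pickers degrade, not something this run exercises:

# Sketch: choose vfio-pci when the IOMMU is active and the module
# exists, mirroring the traced pick_driver logic.
shopt -s nullglob
is_driver() { modprobe --show-depends "$1" 2>/dev/null | grep -q '\.ko'; }

iommu_groups=(/sys/kernel/iommu_groups/*)
if (( ${#iommu_groups[@]} > 0 )) && is_driver vfio_pci; then
    echo vfio-pci              # 176 groups > 0 on this machine
else
    echo uio_pci_generic       # assumed no-IOMMU fallback
fi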
00:04:28.683 02:52:23 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:28.683 02:52:23 -- setup/driver.sh@45 -- # setup output config
00:04:28.683 02:52:23 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:28.683 02:52:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:32.876 02:52:27 -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:04:32.876 02:52:27 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:04:32.876 02:52:27 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[... xtrace elided: the marker loop repeats driver.sh@58/@61/@57 for each remaining setup.sh config line, matching "-> vfio-pci" every time ...]
00:04:34.085 02:52:29 -- setup/driver.sh@64 -- # (( fail == 0 ))
00:04:34.085 02:52:29 -- setup/driver.sh@65 -- # setup reset
00:04:34.085 02:52:29 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:34.085 02:52:29 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:39.358
00:04:39.358 real	0m9.754s
00:04:39.358 user	0m2.484s
00:04:39.358 sys	0m4.982s
00:04:39.358 02:52:33 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:39.358 02:52:33 -- common/autotest_common.sh@10 -- # set +x
00:04:39.358 ************************************
00:04:39.358 END TEST guess_driver
00:04:39.358 ************************************
00:04:39.358
00:04:39.358 real	0m14.291s
00:04:39.358 user	0m3.749s
00:04:39.358 sys	0m7.428s
00:04:39.358 02:52:33 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:39.358 02:52:33 -- common/autotest_common.sh@10 -- # set +x
00:04:39.358 ************************************
00:04:39.358 END TEST driver
00:04:39.358 ************************************
00:04:39.358 02:52:33 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:39.358 02:52:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:39.358 02:52:33 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:39.358 02:52:33 -- common/autotest_common.sh@10 -- # set +x
00:04:39.358 ************************************
00:04:39.358 START TEST devices
00:04:39.358 ************************************
00:04:39.358 02:52:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:39.358 * Looking for test storage...
00:04:39.358 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:39.358 02:52:33 -- setup/devices.sh@190 -- # trap cleanup EXIT
00:04:39.358 02:52:33 -- setup/devices.sh@192 -- # setup reset
00:04:39.358 02:52:33 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:39.358 02:52:33 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:41.891 02:52:36 -- setup/devices.sh@194 -- # get_zoned_devs
00:04:41.891 02:52:36 -- common/autotest_common.sh@1654 -- # zoned_devs=()
00:04:41.891 02:52:36 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs
00:04:41.891 02:52:36 -- common/autotest_common.sh@1655 -- # local nvme bdf
00:04:41.891 02:52:36 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme*
00:04:41.891 02:52:36 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1
00:04:41.891 02:52:36 -- common/autotest_common.sh@1647 -- # local device=nvme0n1
00:04:41.891 02:52:36 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:04:41.891 02:52:36 -- common/autotest_common.sh@1650 -- # [[ none != none ]]
00:04:41.891 02:52:36 -- setup/devices.sh@196 -- # blocks=()
00:04:41.891 02:52:36 -- setup/devices.sh@196 -- # declare -a blocks
00:04:41.891 02:52:36 -- setup/devices.sh@197 -- # blocks_to_pci=()
00:04:41.891 02:52:36 -- setup/devices.sh@197 -- # declare -A blocks_to_pci
00:04:41.891 02:52:36 -- setup/devices.sh@198 -- # min_disk_size=3221225472
00:04:41.891 02:52:36 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*)
00:04:41.891 02:52:36 -- setup/devices.sh@201 -- # ctrl=nvme0n1
00:04:41.891 02:52:36 -- setup/devices.sh@201 -- # ctrl=nvme0
00:04:41.891 02:52:36 -- setup/devices.sh@202 -- # pci=0000:d8:00.0
00:04:41.891 02:52:36 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]]
00:04:41.891 02:52:36 -- setup/devices.sh@204 -- # block_in_use nvme0n1
00:04:41.891 02:52:36 -- scripts/common.sh@380 -- # local block=nvme0n1 pt
00:04:41.891 02:52:36 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1
00:04:41.892 No valid GPT data, bailing
00:04:41.892 02:52:37 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:04:41.892 02:52:37 -- scripts/common.sh@393 -- # pt=
00:04:41.892 02:52:37 -- scripts/common.sh@394 -- # return 1
00:04:41.892 02:52:37 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1
00:04:41.892 02:52:37 -- setup/common.sh@76 -- # local dev=nvme0n1
00:04:41.892 02:52:37 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]]
00:04:41.892 02:52:37 -- setup/common.sh@80 -- # echo 1600321314816
00:04:41.892 02:52:37 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size ))
00:04:41.892 02:52:37 -- setup/devices.sh@205 -- # blocks+=("${block##*/}")
00:04:41.892 02:52:37 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0
00:04:41.892 02:52:37 -- setup/devices.sh@209 -- # (( 1 > 0 ))
00:04:41.892 02:52:37 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1
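get_zoned_devs and the size gate above decide which NVMe disks are usable: zoned namespaces are excluded, and anything under min_disk_size (3 GiB) is skipped. A minimal sketch of the zoned probe via the standard sysfs attribute:

# Sketch: collect zoned NVMe block devices; "none" in queue/zoned
# marks a conventional (usable) device, as with nvme0n1 above.
shopt -s nullglob
declare -A zoned_devs=()
for nvme in /sys/block/nvme*; do
    [[ -e $nvme/queue/zoned ]] || continue
    [[ $(<"$nvme/queue/zoned") != none ]] && zoned_devs[${nvme##*/}]=1
done
echo "zoned devices: ${!zoned_devs[*]}"   # empty on this machine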
00:04:41.892 02:52:37 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount
00:04:41.892 02:52:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:41.892 02:52:37 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:41.892 02:52:37 -- common/autotest_common.sh@10 -- # set +x
00:04:41.892 ************************************
00:04:41.892 START TEST nvme_mount
00:04:41.892 ************************************
00:04:41.892 02:52:37 -- common/autotest_common.sh@1104 -- # nvme_mount
00:04:41.892 02:52:37 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1
00:04:41.892 02:52:37 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1
00:04:41.892 02:52:37 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:41.892 02:52:37 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:41.892 02:52:37 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1
00:04:41.892 02:52:37 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:41.892 02:52:37 -- setup/common.sh@40 -- # local part_no=1
00:04:41.892 02:52:37 -- setup/common.sh@41 -- # local size=1073741824
00:04:41.892 02:52:37 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:41.892 02:52:37 -- setup/common.sh@44 -- # parts=()
00:04:41.892 02:52:37 -- setup/common.sh@44 -- # local parts
00:04:41.892 02:52:37 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:41.892 02:52:37 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:41.892 02:52:37 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:41.892 02:52:37 -- setup/common.sh@46 -- # (( part++ ))
00:04:41.892 02:52:37 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:41.892 02:52:37 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:41.892 02:52:37 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:41.892 02:52:37 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:04:42.828 Creating new GPT entries in memory.
00:04:42.828 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:42.828 other utilities.
00:04:42.828 02:52:38 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:42.828 02:52:38 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:42.828 02:52:38 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:42.828 02:52:38 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:42.828 02:52:38 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:44.211 Creating new GPT entries in memory.
00:04:44.211 The operation has completed successfully.
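partition_drive converts the 1 GiB request into 512-byte sectors (1073741824 / 512 = 2097152), so the partition spans sectors 2048..2099199 as traced above. A standalone sketch of the same sgdisk sequence; DISK is a placeholder, the commands are destructive, and udevadm settle stands in for the harness's sync_dev_uevents.sh:

# Sketch: wipe the label and create one 1 GiB partition at sector
# 2048. DESTRUCTIVE -- DISK is a placeholder device.
DISK=/dev/nvme0n1
size=$(( 1073741824 / 512 ))   # 1 GiB in 512 B sectors = 2097152
start=2048
end=$(( start + size - 1 ))    # 2099199, matching the trace above
sgdisk "$DISK" --zap-all
flock "$DISK" sgdisk "$DISK" --new=1:"$start":"$end"
udevadm settle                 # wait for /dev/nvme0n1p1 to appear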
00:04:44.211 02:52:39 -- setup/common.sh@57 -- # (( part++ ))
00:04:44.211 02:52:39 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:44.211 02:52:39 -- setup/common.sh@62 -- # wait 627350
00:04:44.211 02:52:39 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:44.211 02:52:39 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=
00:04:44.211 02:52:39 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:44.211 02:52:39 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]]
00:04:44.211 02:52:39 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1
00:04:44.211 02:52:39 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:44.211 02:52:39 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:44.211 02:52:39 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:44.211 02:52:39 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1
00:04:44.211 02:52:39 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:44.211 02:52:39 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:44.211 02:52:39 -- setup/devices.sh@53 -- # local found=0
00:04:44.211 02:52:39 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:44.211 02:52:39 -- setup/devices.sh@56 -- # :
00:04:44.211 02:52:39 -- setup/devices.sh@59 -- # local pci status
00:04:44.211 02:52:39 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:44.211 02:52:39 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:44.211 02:52:39 -- setup/devices.sh@47 -- # setup output config
00:04:44.211 02:52:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:44.211 02:52:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:47.502 02:52:42 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:47.502 02:52:42 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]]
00:04:47.502 02:52:42 -- setup/devices.sh@63 -- # found=1
[... xtrace elided: the status loop reads the remaining setup.sh config lines (PCI functions 0000:00:04.0-7 and 0000:80:04.0-7) and skips each one that is not the allowed device 0000:d8:00.0 ...]
00:04:47.503 02:52:42 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:47.503 02:52:42 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:47.503 02:52:42 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:47.503 02:52:42 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:47.503 02:52:42 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:47.503 02:52:42 -- setup/devices.sh@110 -- # cleanup_nvme
00:04:47.503 02:52:42 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:47.503 02:52:42 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:47.503 02:52:42 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:47.503 02:52:42 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:04:47.503 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:47.503 02:52:42 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:47.503 02:52:42 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:47.762 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:04:47.762 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54
00:04:47.762 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:47.762 /dev/nvme0n1: calling ioctl to re-read partition table: Success
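The sequence above (mkfs.ext4, mount, marker file, rm, umount, wipefs) is one full lifecycle of the mount check. A condensed sketch with the workspace path replaced by a temporary directory; destructive on PART:

# Sketch: format, mount, prove the mount with a marker file, then
# tear down. DESTRUCTIVE -- PART is a placeholder; needs root.
PART=/dev/nvme0n1p1
MNT=$(mktemp -d)               # stand-in for the test's nvme_mount dir
mkfs.ext4 -qF "$PART"
mount "$PART" "$MNT"
touch "$MNT/test_nvme"         # marker the verify step checks for
[[ -e $MNT/test_nvme ]] && rm "$MNT/test_nvme"
umount "$MNT"
wipefs --all "$PART"           # erase the ext4 signature for the next case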
00:04:47.762 02:52:42 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M
00:04:47.762 02:52:42 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M
00:04:47.762 02:52:42 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:47.762 02:52:42 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:04:47.762 02:52:42 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M
00:04:47.762 02:52:42 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:47.762 02:52:42 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:47.762 02:52:42 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:47.762 02:52:42 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1
00:04:47.762 02:52:42 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:47.762 02:52:42 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:47.762 02:52:42 -- setup/devices.sh@53 -- # local found=0
00:04:47.762 02:52:42 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:47.762 02:52:42 -- setup/devices.sh@56 -- # :
00:04:47.762 02:52:42 -- setup/devices.sh@59 -- # local pci status
00:04:47.762 02:52:42 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:47.762 02:52:42 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:47.762 02:52:42 -- setup/devices.sh@47 -- # setup output config
00:04:47.762 02:52:42 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:47.762 02:52:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:50.295 02:52:45 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:50.295 02:52:45 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:04:50.295 02:52:45 -- setup/devices.sh@63 -- # found=1
[... xtrace elided: the status loop again skips PCI functions 0000:00:04.0-7 and 0000:80:04.0-7, none of which match the allowed device 0000:d8:00.0 ...]
00:04:50.554 02:52:45 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:50.554 02:52:45 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:50.554 02:52:45 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:50.554 02:52:45 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:50.554 02:52:45 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:50.554 02:52:45 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:50.554 02:52:45 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' ''
00:04:50.554 02:52:45 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:50.554 02:52:45 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1
00:04:50.554 02:52:45 -- setup/devices.sh@50 -- # local mount_point=
00:04:50.554 02:52:45 -- setup/devices.sh@51 -- # local test_file=
00:04:50.554 02:52:45 -- setup/devices.sh@53 -- # local found=0
00:04:50.554 02:52:45 -- setup/devices.sh@55 -- # [[ -n '' ]]
00:04:50.554 02:52:45 -- setup/devices.sh@59 -- # local pci status
00:04:50.554 02:52:45 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:50.554 02:52:45 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:50.554 02:52:45 -- setup/devices.sh@47 -- # setup output config
00:04:50.554 02:52:45 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:50.554 02:52:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
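Each verify pass pins PCI_ALLOWED to the NVMe's address, reruns scripts/setup.sh config, and scans the status lines for the expected mount@/data@ marker, proving setup.sh refused to rebind the in-use disk. A rough sketch of that scan; the four-field line shape and the marker text are inferred from this log's output, not taken from the script source:

# Sketch: confirm setup.sh left the busy NVMe device bound. Status
# lines are read as "<pci> <_> <_> <rest>", as in the traced loop.
dev=0000:d8:00.0
mounts=data@nvme0n1            # marker expected for the allowed device
found=0
while read -r pci _ _ status; do
    [[ $pci == "$dev" && $status == *"$mounts"* ]] && found=1
done < <(PCI_ALLOWED="$dev" ./scripts/setup.sh config)
(( found == 1 )) && echo "device $dev correctly left to its mount"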
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:53.846 02:52:48 -- setup/devices.sh@63 -- # found=1 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.846 02:52:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.846 02:52:48 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:53.846 02:52:48 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:53.846 02:52:48 -- setup/devices.sh@68 -- # return 0 00:04:53.846 02:52:48 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:53.846 02:52:48 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:53.846 02:52:48 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:53.846 02:52:48 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:53.846 02:52:48 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:53.846 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:53.846
00:04:53.846 real 0m11.751s
00:04:53.846 user 0m3.148s
00:04:53.846 sys 0m6.278s
00:04:53.846 02:52:48 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:53.846 02:52:48 -- common/autotest_common.sh@10 -- # set +x
00:04:53.846 ************************************
00:04:53.846 END TEST nvme_mount
00:04:53.846 ************************************
00:04:53.846 02:52:48 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:04:53.846 02:52:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:53.846 02:52:48 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:53.846 02:52:48 -- common/autotest_common.sh@10 -- # set +x
00:04:53.846 ************************************
00:04:53.846 START TEST dm_mount
00:04:53.846 ************************************
00:04:53.846 02:52:48 -- common/autotest_common.sh@1104 -- # dm_mount
00:04:53.846 02:52:48 -- setup/devices.sh@144 -- # pv=nvme0n1
00:04:53.846 02:52:48 -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:04:53.846 02:52:48 -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:04:53.846 02:52:48 -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:04:53.846 02:52:48 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:53.846 02:52:48 -- setup/common.sh@40 -- # local part_no=2
00:04:53.846 02:52:48 -- setup/common.sh@41 -- # local size=1073741824
00:04:53.846 02:52:48 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:53.846 02:52:48 -- setup/common.sh@44 -- # parts=()
00:04:53.846 02:52:48 -- setup/common.sh@44 -- # local parts
00:04:53.846 02:52:48 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:53.846 02:52:48 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:53.846 02:52:48 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:53.846 02:52:48 -- setup/common.sh@46 -- # (( part++ ))
00:04:53.846 02:52:48 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:53.846 02:52:48 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:53.846 02:52:48 -- setup/common.sh@46 -- # (( part++ ))
00:04:53.846 02:52:48 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:53.846 02:52:48 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:53.846 02:52:48 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:53.846 02:52:48 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:04:54.785 Creating new GPT entries in memory.
00:04:54.785 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:54.785 other utilities.
00:04:54.785 02:52:49 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:54.785 02:52:49 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:54.785 02:52:49 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:54.785 02:52:49 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:54.785 02:52:49 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:55.724 Creating new GPT entries in memory. The operation has completed successfully.
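The sector arithmetic behind those sgdisk calls can be read straight out of the numbers: 1073741824 bytes / 512 = 2097152 sectors, so partition 1 spans sectors 2048..2099199 and partition 2 (created in the very next trace entries below) spans 2099200..4196351. A minimal standalone sketch of the same loop, assuming the /dev/nvme0n1 device and 1 GiB size shown in the trace; it illustrates the pattern, it is not the test script itself:

    # Zap the disk, then carve two 1 GiB GPT partitions at computed
    # sector boundaries, as in the partition_drive trace above.
    # Device path and byte size come from the log; run only against
    # a disposable scratch disk.
    disk=/dev/nvme0n1
    part_no=2
    size=1073741824              # 1 GiB in bytes
    (( size /= 512 ))            # 2097152 sectors of 512 bytes
    sgdisk "$disk" --zap-all
    part_start=0
    part_end=0
    for (( part = 1; part <= part_no; part++ )); do
        # partition 1 starts at the conventional sector 2048; each
        # later partition starts right after its predecessor ends
        (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
        (( part_end = part_start + size - 1 ))
        flock "$disk" sgdisk "$disk" --new="$part:$part_start:$part_end"
    done

The flock around each sgdisk invocation presumably serializes writers of the partition table while udev events from the previous call are still settling.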
00:04:55.724 02:52:50 -- setup/common.sh@57 -- # (( part++ )) 00:04:55.724 02:52:50 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.724 02:52:50 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:55.724 02:52:50 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:55.724 02:52:50 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:56.662 The operation has completed successfully. 00:04:56.662 02:52:51 -- setup/common.sh@57 -- # (( part++ )) 00:04:56.662 02:52:51 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:56.662 02:52:51 -- setup/common.sh@62 -- # wait 631672 00:04:56.922 02:52:51 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:56.922 02:52:51 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.922 02:52:51 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.922 02:52:51 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:56.922 02:52:51 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:56.922 02:52:51 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.922 02:52:51 -- setup/devices.sh@161 -- # break 00:04:56.922 02:52:51 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.922 02:52:51 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:56.922 02:52:51 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:56.922 02:52:51 -- setup/devices.sh@166 -- # dm=dm-0 00:04:56.922 02:52:51 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:56.922 02:52:51 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:56.922 02:52:51 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.922 02:52:51 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:56.922 02:52:51 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.922 02:52:51 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.922 02:52:51 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:56.922 02:52:51 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.922 02:52:52 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.922 02:52:52 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:56.922 02:52:52 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:56.922 02:52:52 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.922 02:52:52 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.922 02:52:52 -- setup/devices.sh@53 -- # local found=0 00:04:56.922 02:52:52 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:56.922 02:52:52 -- setup/devices.sh@56 -- # : 00:04:56.922 
02:52:52 -- setup/devices.sh@59 -- # local pci status 00:04:56.922 02:52:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.922 02:52:52 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:56.922 02:52:52 -- setup/devices.sh@47 -- # setup output config 00:04:56.922 02:52:52 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.922 02:52:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:00.210 02:52:55 -- setup/devices.sh@63 -- # found=1 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:00.210 02:52:55 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:00.210 02:52:55 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.210 02:52:55 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:00.210 02:52:55 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:00.210 02:52:55 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.210 02:52:55 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:00.210 02:52:55 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:00.210 02:52:55 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:00.210 02:52:55 -- setup/devices.sh@50 -- # local mount_point= 00:05:00.210 02:52:55 -- setup/devices.sh@51 -- # local test_file= 00:05:00.210 02:52:55 -- setup/devices.sh@53 -- # local found=0 00:05:00.210 02:52:55 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:00.210 02:52:55 -- setup/devices.sh@59 -- # local pci status 00:05:00.210 02:52:55 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.210 02:52:55 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:00.210 02:52:55 -- setup/devices.sh@47 -- # setup output config 00:05:00.210 02:52:55 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:00.210 02:52:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:03.502 02:52:58 -- setup/devices.sh@63 -- # found=1 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- 
# read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.502 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.502 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.503 02:52:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.503 02:52:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.503 02:52:58 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:03.503 02:52:58 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:03.503 02:52:58 -- setup/devices.sh@68 -- # return 0 00:05:03.503 02:52:58 -- setup/devices.sh@187 -- # cleanup_dm 00:05:03.503 02:52:58 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:03.503 02:52:58 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:03.503 02:52:58 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:03.503 02:52:58 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.503 02:52:58 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:03.503 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:03.503 02:52:58 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:03.503 02:52:58 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:03.503 00:05:03.503 real 0m9.809s 00:05:03.503 user 0m2.486s 00:05:03.503 sys 0m4.420s 00:05:03.503 02:52:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.503 02:52:58 -- common/autotest_common.sh@10 -- # set +x 00:05:03.503 ************************************ 00:05:03.503 END TEST dm_mount 00:05:03.503 ************************************ 00:05:03.503 02:52:58 -- setup/devices.sh@1 -- # cleanup 00:05:03.503 02:52:58 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:03.503 02:52:58 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.503 02:52:58 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.503 02:52:58 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:03.503 02:52:58 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:03.503 02:52:58 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:03.763 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:03.763 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54
00:05:03.763 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:05:03.763 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:05:03.763 02:52:58 -- setup/devices.sh@12 -- # cleanup_dm
00:05:03.763 02:52:58 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount
00:05:03.763 02:52:58 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:05:03.763 02:52:58 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:05:03.763 02:52:58 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:05:03.763 02:52:58 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:05:03.763 02:52:58 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:05:03.763
00:05:03.763 real 0m25.290s
00:05:03.763 user 0m6.798s
00:05:03.763 sys 0m13.107s
00:05:03.763 02:52:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:03.763 02:52:58 -- common/autotest_common.sh@10 -- # set +x
00:05:03.763 ************************************
00:05:03.763 END TEST devices
00:05:03.763 ************************************
00:05:03.763
00:05:03.763 real 1m29.497s
00:05:03.763 user 0m27.245s
00:05:03.763 sys 0m50.950s
00:05:03.763 02:52:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:03.763 02:52:58 -- common/autotest_common.sh@10 -- # set +x
00:05:03.763 ************************************
00:05:03.763 END TEST setup.sh
00:05:03.763 ************************************
00:05:04.022 02:52:59 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:05:06.560 Hugepages
00:05:06.560 node hugesize free / total
00:05:06.560 node0 1048576kB 0 / 0
00:05:06.560 node0 2048kB 2048 / 2048
00:05:06.560 node1 1048576kB 0 / 0
00:05:06.560 node1 2048kB 0 / 0
00:05:06.560
00:05:06.560 Type BDF Vendor Device NUMA Driver Device Block devices
00:05:06.560 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:05:06.560 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:05:06.820 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:05:06.820 02:53:01 -- spdk/autotest.sh@141 -- # uname -s
00:05:06.820 02:53:01 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]]
00:05:06.820 02:53:01 -- spdk/autotest.sh@143 -- # nvme_namespace_revert
00:05:06.820 02:53:01 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:10.181 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:10.181 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:10.181 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:10.181 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:10.181 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:10.181 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:10.181 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:11.558 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:11.816 02:53:06 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:12.763 02:53:07 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:12.763 02:53:07 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:12.763 02:53:07 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:05:12.763 02:53:07 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:05:12.763 02:53:07 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:12.763 02:53:07 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:12.763 02:53:07 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:12.763 02:53:07 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:12.763 02:53:07 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:12.763 02:53:07 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:12.763 02:53:07 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:12.763 02:53:07 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:16.053 Waiting for block devices as requested 00:05:16.053 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:16.053 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:16.313 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:16.313 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:16.313 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:16.313 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:16.572 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:16.572 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:16.572 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:16.830 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:16.831 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:16.831 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:17.089 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:17.089 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:17.089 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:17.348 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:17.348 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:17.606 02:53:12 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:05:17.606 02:53:12 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:17.606 02:53:12 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:17.606 02:53:12 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:05:17.606 02:53:12 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1530 -- # grep oacs 00:05:17.606 02:53:12 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:05:17.606 02:53:12 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:05:17.606 02:53:12 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:05:17.606 02:53:12 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:05:17.606 02:53:12 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:05:17.606 02:53:12 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:05:17.606 02:53:12 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:05:17.606 02:53:12 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:05:17.606 02:53:12 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:05:17.606 02:53:12 -- common/autotest_common.sh@1542 -- # continue 00:05:17.607 02:53:12 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:05:17.607 02:53:12 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:17.607 02:53:12 -- common/autotest_common.sh@10 -- # set +x 00:05:17.607 02:53:12 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:05:17.607 02:53:12 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:17.607 02:53:12 -- common/autotest_common.sh@10 -- # set +x 00:05:17.607 02:53:12 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:20.894 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:20.894 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.153 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:21.153 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.153 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.153 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:23.059 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:23.059 02:53:17 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:23.059 02:53:17 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:23.059 02:53:17 -- common/autotest_common.sh@10 -- # set +x 00:05:23.059 02:53:17 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:23.059 02:53:17 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:23.059 02:53:17 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:23.059 02:53:17 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:23.059 02:53:17 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:23.059 02:53:17 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:05:23.059 02:53:17 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:23.059 02:53:17 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:23.059 02:53:17 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:23.059 02:53:17 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:23.059 02:53:17 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:23.059 02:53:18 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:23.059 02:53:18 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:23.059 02:53:18 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:05:23.059 02:53:18 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:23.059 02:53:18 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:05:23.059 02:53:18 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:23.059 02:53:18 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:05:23.059 02:53:18 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:23.059 02:53:18 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:23.059 02:53:18 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=641596 00:05:23.059 02:53:18 -- common/autotest_common.sh@1583 -- # waitforlisten 641596 00:05:23.059 02:53:18 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:23.059 02:53:18 -- common/autotest_common.sh@819 -- # '[' -z 641596 ']' 00:05:23.059 02:53:18 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:23.059 02:53:18 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:23.059 02:53:18 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:23.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:23.059 02:53:18 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:23.059 02:53:18 -- common/autotest_common.sh@10 -- # set +x 00:05:23.059 [2024-07-14 02:53:18.078211] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
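Before this spdk_tgt launch, the trace resolved which controllers to hand it by filtering NVMe PCI addresses on their PCI device id (0x0a54, the value later read back with cat from sysfs). A sketch of that filter under the same sysfs layout the log shows; this get_nvme_bdfs reads sysfs directly and is an assumed stand-in for the test's gen_nvme.sh | jq -r '.config[].params.traddr' pipeline:

    # Collect NVMe controller PCI addresses, then keep only those whose
    # PCI device id matches 0x0a54, mirroring the bdf filter traced above.
    get_nvme_bdfs() {
        local ctrl
        for ctrl in /sys/class/nvme/nvme*; do
            [[ -e $ctrl ]] || continue
            # the controller's device link resolves to its PCI directory
            basename "$(readlink -f "$ctrl/device")"    # e.g. 0000:d8:00.0
        done
    }

    bdfs=()
    for bdf in $(get_nvme_bdfs); do
        device=$(cat "/sys/bus/pci/devices/$bdf/device")    # e.g. 0x0a54
        [[ $device == 0x0a54 ]] && bdfs+=("$bdf")
    done
    printf '%s\n' "${bdfs[@]}"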
00:05:23.059 [2024-07-14 02:53:18.078302] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641596 ]
00:05:23.059 EAL: No free 2048 kB hugepages reported on node 1
00:05:23.059 [2024-07-14 02:53:18.148746] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:23.059 [2024-07-14 02:53:18.190345] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:05:23.059 [2024-07-14 02:53:18.190479] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:23.995 02:53:18 -- common/autotest_common.sh@848 -- # (( i == 0 ))
00:05:23.995 02:53:18 -- common/autotest_common.sh@852 -- # return 0
00:05:23.995 02:53:18 -- common/autotest_common.sh@1585 -- # bdf_id=0
00:05:23.995 02:53:18 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}"
00:05:23.995 02:53:18 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
00:05:27.290 nvme0n1
00:05:27.290 02:53:21 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
00:05:27.290 [2024-07-14 02:53:22.036184] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal
00:05:27.290 request:
00:05:27.290 {
00:05:27.290 "nvme_ctrlr_name": "nvme0",
00:05:27.290 "password": "test",
00:05:27.290 "method": "bdev_nvme_opal_revert",
00:05:27.290 "req_id": 1
00:05:27.290 }
00:05:27.290 Got JSON-RPC error response
00:05:27.290 response:
00:05:27.290 {
00:05:27.290 "code": -32602,
00:05:27.290 "message": "Invalid parameters"
00:05:27.290 }
00:05:27.290 02:53:22 -- common/autotest_common.sh@1589 -- # true
00:05:27.290 02:53:22 -- common/autotest_common.sh@1590 -- # (( ++bdf_id ))
00:05:27.290 02:53:22 -- common/autotest_common.sh@1593 -- # killprocess 641596
00:05:27.290 02:53:22 -- common/autotest_common.sh@926 -- # '[' -z 641596 ']'
00:05:27.290 02:53:22 -- common/autotest_common.sh@930 -- # kill -0 641596
00:05:27.290 02:53:22 -- common/autotest_common.sh@931 -- # uname
00:05:27.290 02:53:22 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']'
00:05:27.290 02:53:22 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 641596
00:05:27.290 02:53:22 -- common/autotest_common.sh@932 -- # process_name=reactor_0
00:05:27.290 02:53:22 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']'
00:05:27.290 02:53:22 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 641596'
00:05:27.290 killing process with pid 641596
00:05:27.290 02:53:22 -- common/autotest_common.sh@945 -- # kill 641596
00:05:27.290 02:53:22 -- common/autotest_common.sh@950 -- # wait 641596
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:27.290 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
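The JSON-RPC failure above is deliberate: bdev_nvme_opal_revert returns -32602 on a controller that reports no Opal support, and the test swallows the error before tearing down spdk_tgt (the repeated EAL DMA-remapping messages are that teardown). A sketch of the same call pattern, reusing the rpc.py commands visible in the trace; the || true stands in for the script's error handling and is an assumption, not the script itself:

    # Attach the controller over PCIe, then attempt a best-effort Opal
    # revert; on a non-Opal drive the revert RPC fails with "Invalid
    # parameters" and the cleanup deliberately ignores it.
    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    "$rpc" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    "$rpc" bdev_nvme_opal_revert -b nvme0 -p test || true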
00:05:29.196 02:53:24 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']'
00:05:29.196 02:53:24 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']'
00:05:29.196 02:53:24 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]]
00:05:29.196 02:53:24 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]]
00:05:29.196 02:53:24 -- spdk/autotest.sh@173 -- # timing_enter lib
00:05:29.196 02:53:24 -- common/autotest_common.sh@712 -- # xtrace_disable
00:05:29.196 02:53:24 -- common/autotest_common.sh@10 -- # set +x
00:05:29.196 02:53:24 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:29.196 02:53:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:29.196 02:53:24 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:29.196 02:53:24 -- common/autotest_common.sh@10 -- # set +x
00:05:29.196 ************************************
00:05:29.196 START TEST env
00:05:29.196 ************************************
00:05:29.196 02:53:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:29.196 * Looking for test storage...
00:05:29.196 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:29.196 02:53:24 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:29.196 02:53:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.196 02:53:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.196 02:53:24 -- common/autotest_common.sh@10 -- # set +x 00:05:29.196 ************************************ 00:05:29.196 START TEST env_memory 00:05:29.196 ************************************ 00:05:29.196 02:53:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:29.196 00:05:29.196 00:05:29.196 CUnit - A unit testing framework for C - Version 2.1-3 00:05:29.196 http://cunit.sourceforge.net/ 00:05:29.196 00:05:29.196 00:05:29.196 Suite: memory 00:05:29.456 Test: alloc and free memory map ...[2024-07-14 02:53:24.464022] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:29.456 passed 00:05:29.456 Test: mem map translation ...[2024-07-14 02:53:24.477714] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:29.456 [2024-07-14 02:53:24.477731] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:29.456 [2024-07-14 02:53:24.477762] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:29.456 [2024-07-14 02:53:24.477771] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:29.456 passed 00:05:29.456 Test: mem map registration ...[2024-07-14 02:53:24.499678] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:29.456 [2024-07-14 02:53:24.499697] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:29.456 passed 00:05:29.457 Test: mem map adjacent registrations ...passed 00:05:29.457 00:05:29.457 Run Summary: Type Total Ran Passed Failed Inactive 00:05:29.457 suites 1 1 n/a 0 0 00:05:29.457 tests 4 4 4 0 0 00:05:29.457 asserts 152 152 152 0 n/a 00:05:29.457 00:05:29.457 Elapsed time = 0.089 seconds 00:05:29.457 00:05:29.457 real 0m0.102s 00:05:29.457 user 0m0.091s 00:05:29.457 sys 0m0.011s 00:05:29.457 02:53:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.457 02:53:24 -- common/autotest_common.sh@10 -- # set +x 00:05:29.457 ************************************ 00:05:29.457 END TEST env_memory 00:05:29.457 ************************************ 00:05:29.457 02:53:24 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:29.457 02:53:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.457 02:53:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.457 02:53:24 -- common/autotest_common.sh@10 
-- # set +x 00:05:29.457 ************************************ 00:05:29.457 START TEST env_vtophys 00:05:29.457 ************************************ 00:05:29.457 02:53:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:29.457 EAL: lib.eal log level changed from notice to debug 00:05:29.457 EAL: Detected lcore 0 as core 0 on socket 0 00:05:29.457 EAL: Detected lcore 1 as core 1 on socket 0 00:05:29.457 EAL: Detected lcore 2 as core 2 on socket 0 00:05:29.457 EAL: Detected lcore 3 as core 3 on socket 0 00:05:29.457 EAL: Detected lcore 4 as core 4 on socket 0 00:05:29.457 EAL: Detected lcore 5 as core 5 on socket 0 00:05:29.457 EAL: Detected lcore 6 as core 6 on socket 0 00:05:29.457 EAL: Detected lcore 7 as core 8 on socket 0 00:05:29.457 EAL: Detected lcore 8 as core 9 on socket 0 00:05:29.457 EAL: Detected lcore 9 as core 10 on socket 0 00:05:29.457 EAL: Detected lcore 10 as core 11 on socket 0 00:05:29.457 EAL: Detected lcore 11 as core 12 on socket 0 00:05:29.457 EAL: Detected lcore 12 as core 13 on socket 0 00:05:29.457 EAL: Detected lcore 13 as core 14 on socket 0 00:05:29.457 EAL: Detected lcore 14 as core 16 on socket 0 00:05:29.457 EAL: Detected lcore 15 as core 17 on socket 0 00:05:29.457 EAL: Detected lcore 16 as core 18 on socket 0 00:05:29.457 EAL: Detected lcore 17 as core 19 on socket 0 00:05:29.457 EAL: Detected lcore 18 as core 20 on socket 0 00:05:29.457 EAL: Detected lcore 19 as core 21 on socket 0 00:05:29.457 EAL: Detected lcore 20 as core 22 on socket 0 00:05:29.457 EAL: Detected lcore 21 as core 24 on socket 0 00:05:29.457 EAL: Detected lcore 22 as core 25 on socket 0 00:05:29.457 EAL: Detected lcore 23 as core 26 on socket 0 00:05:29.457 EAL: Detected lcore 24 as core 27 on socket 0 00:05:29.457 EAL: Detected lcore 25 as core 28 on socket 0 00:05:29.457 EAL: Detected lcore 26 as core 29 on socket 0 00:05:29.457 EAL: Detected lcore 27 as core 30 on socket 0 00:05:29.457 EAL: Detected lcore 28 as core 0 on socket 1 00:05:29.457 EAL: Detected lcore 29 as core 1 on socket 1 00:05:29.457 EAL: Detected lcore 30 as core 2 on socket 1 00:05:29.457 EAL: Detected lcore 31 as core 3 on socket 1 00:05:29.457 EAL: Detected lcore 32 as core 4 on socket 1 00:05:29.457 EAL: Detected lcore 33 as core 5 on socket 1 00:05:29.457 EAL: Detected lcore 34 as core 6 on socket 1 00:05:29.457 EAL: Detected lcore 35 as core 8 on socket 1 00:05:29.457 EAL: Detected lcore 36 as core 9 on socket 1 00:05:29.457 EAL: Detected lcore 37 as core 10 on socket 1 00:05:29.457 EAL: Detected lcore 38 as core 11 on socket 1 00:05:29.457 EAL: Detected lcore 39 as core 12 on socket 1 00:05:29.457 EAL: Detected lcore 40 as core 13 on socket 1 00:05:29.457 EAL: Detected lcore 41 as core 14 on socket 1 00:05:29.457 EAL: Detected lcore 42 as core 16 on socket 1 00:05:29.457 EAL: Detected lcore 43 as core 17 on socket 1 00:05:29.457 EAL: Detected lcore 44 as core 18 on socket 1 00:05:29.457 EAL: Detected lcore 45 as core 19 on socket 1 00:05:29.457 EAL: Detected lcore 46 as core 20 on socket 1 00:05:29.457 EAL: Detected lcore 47 as core 21 on socket 1 00:05:29.457 EAL: Detected lcore 48 as core 22 on socket 1 00:05:29.457 EAL: Detected lcore 49 as core 24 on socket 1 00:05:29.457 EAL: Detected lcore 50 as core 25 on socket 1 00:05:29.457 EAL: Detected lcore 51 as core 26 on socket 1 00:05:29.457 EAL: Detected lcore 52 as core 27 on socket 1 00:05:29.457 EAL: Detected lcore 53 as core 28 on socket 1 00:05:29.457 EAL: Detected lcore 54 as core 
29 on socket 1 00:05:29.457 EAL: Detected lcore 55 as core 30 on socket 1 00:05:29.457 EAL: Detected lcore 56 as core 0 on socket 0 00:05:29.457 EAL: Detected lcore 57 as core 1 on socket 0 00:05:29.457 EAL: Detected lcore 58 as core 2 on socket 0 00:05:29.457 EAL: Detected lcore 59 as core 3 on socket 0 00:05:29.457 EAL: Detected lcore 60 as core 4 on socket 0 00:05:29.457 EAL: Detected lcore 61 as core 5 on socket 0 00:05:29.457 EAL: Detected lcore 62 as core 6 on socket 0 00:05:29.457 EAL: Detected lcore 63 as core 8 on socket 0 00:05:29.457 EAL: Detected lcore 64 as core 9 on socket 0 00:05:29.457 EAL: Detected lcore 65 as core 10 on socket 0 00:05:29.457 EAL: Detected lcore 66 as core 11 on socket 0 00:05:29.457 EAL: Detected lcore 67 as core 12 on socket 0 00:05:29.457 EAL: Detected lcore 68 as core 13 on socket 0 00:05:29.457 EAL: Detected lcore 69 as core 14 on socket 0 00:05:29.457 EAL: Detected lcore 70 as core 16 on socket 0 00:05:29.457 EAL: Detected lcore 71 as core 17 on socket 0 00:05:29.457 EAL: Detected lcore 72 as core 18 on socket 0 00:05:29.457 EAL: Detected lcore 73 as core 19 on socket 0 00:05:29.457 EAL: Detected lcore 74 as core 20 on socket 0 00:05:29.457 EAL: Detected lcore 75 as core 21 on socket 0 00:05:29.457 EAL: Detected lcore 76 as core 22 on socket 0 00:05:29.457 EAL: Detected lcore 77 as core 24 on socket 0 00:05:29.457 EAL: Detected lcore 78 as core 25 on socket 0 00:05:29.457 EAL: Detected lcore 79 as core 26 on socket 0 00:05:29.457 EAL: Detected lcore 80 as core 27 on socket 0 00:05:29.457 EAL: Detected lcore 81 as core 28 on socket 0 00:05:29.457 EAL: Detected lcore 82 as core 29 on socket 0 00:05:29.457 EAL: Detected lcore 83 as core 30 on socket 0 00:05:29.457 EAL: Detected lcore 84 as core 0 on socket 1 00:05:29.457 EAL: Detected lcore 85 as core 1 on socket 1 00:05:29.457 EAL: Detected lcore 86 as core 2 on socket 1 00:05:29.457 EAL: Detected lcore 87 as core 3 on socket 1 00:05:29.457 EAL: Detected lcore 88 as core 4 on socket 1 00:05:29.457 EAL: Detected lcore 89 as core 5 on socket 1 00:05:29.457 EAL: Detected lcore 90 as core 6 on socket 1 00:05:29.457 EAL: Detected lcore 91 as core 8 on socket 1 00:05:29.457 EAL: Detected lcore 92 as core 9 on socket 1 00:05:29.457 EAL: Detected lcore 93 as core 10 on socket 1 00:05:29.457 EAL: Detected lcore 94 as core 11 on socket 1 00:05:29.457 EAL: Detected lcore 95 as core 12 on socket 1 00:05:29.457 EAL: Detected lcore 96 as core 13 on socket 1 00:05:29.457 EAL: Detected lcore 97 as core 14 on socket 1 00:05:29.457 EAL: Detected lcore 98 as core 16 on socket 1 00:05:29.457 EAL: Detected lcore 99 as core 17 on socket 1 00:05:29.457 EAL: Detected lcore 100 as core 18 on socket 1 00:05:29.457 EAL: Detected lcore 101 as core 19 on socket 1 00:05:29.457 EAL: Detected lcore 102 as core 20 on socket 1 00:05:29.457 EAL: Detected lcore 103 as core 21 on socket 1 00:05:29.457 EAL: Detected lcore 104 as core 22 on socket 1 00:05:29.457 EAL: Detected lcore 105 as core 24 on socket 1 00:05:29.457 EAL: Detected lcore 106 as core 25 on socket 1 00:05:29.457 EAL: Detected lcore 107 as core 26 on socket 1 00:05:29.457 EAL: Detected lcore 108 as core 27 on socket 1 00:05:29.457 EAL: Detected lcore 109 as core 28 on socket 1 00:05:29.457 EAL: Detected lcore 110 as core 29 on socket 1 00:05:29.457 EAL: Detected lcore 111 as core 30 on socket 1 00:05:29.457 EAL: Maximum logical cores by configuration: 128 00:05:29.457 EAL: Detected CPU lcores: 112 00:05:29.457 EAL: Detected NUMA nodes: 2 00:05:29.457 EAL: Checking presence 
of .so 'librte_eal.so.23.0' 00:05:29.457 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:29.457 EAL: Checking presence of .so 'librte_eal.so' 00:05:29.457 EAL: Detected static linkage of DPDK 00:05:29.457 EAL: No shared files mode enabled, IPC will be disabled 00:05:29.457 EAL: Bus pci wants IOVA as 'DC' 00:05:29.457 EAL: Buses did not request a specific IOVA mode. 00:05:29.457 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:29.457 EAL: Selected IOVA mode 'VA' 00:05:29.457 EAL: No free 2048 kB hugepages reported on node 1 00:05:29.457 EAL: Probing VFIO support... 00:05:29.457 EAL: IOMMU type 1 (Type 1) is supported 00:05:29.457 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:29.457 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:29.457 EAL: VFIO support initialized 00:05:29.457 EAL: Ask a virtual area of 0x2e000 bytes 00:05:29.457 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:29.457 EAL: Setting up physically contiguous memory... 00:05:29.457 EAL: Setting maximum number of open files to 524288 00:05:29.457 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:29.457 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:29.457 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:29.457 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.457 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:29.457 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.457 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.457 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:29.457 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:29.457 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.457 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:29.457 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.457 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.457 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:29.457 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:29.457 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.457 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:29.457 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.457 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.457 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:29.457 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:29.457 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.457 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:29.457 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.457 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.457 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:29.457 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:29.457 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:29.457 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.457 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:29.457 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.457 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.457 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:29.457 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:29.458 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.458 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 
00:05:29.458 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.458 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.458 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:29.458 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:29.458 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.458 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:29.458 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.458 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.458 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:29.458 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:29.458 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.458 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:29.458 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.458 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.458 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:29.458 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:29.458 EAL: Hugepages will be freed exactly as allocated. 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: TSC frequency is ~2500000 KHz 00:05:29.458 EAL: Main lcore 0 is ready (tid=7f0af33c4a00;cpuset=[0]) 00:05:29.458 EAL: Trying to obtain current memory policy. 00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 0 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 2MB 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Mem event callback 'spdk:(nil)' registered 00:05:29.458 00:05:29.458 00:05:29.458 CUnit - A unit testing framework for C - Version 2.1-3 00:05:29.458 http://cunit.sourceforge.net/ 00:05:29.458 00:05:29.458 00:05:29.458 Suite: components_suite 00:05:29.458 Test: vtophys_malloc_test ...passed 00:05:29.458 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 4 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 4MB 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was shrunk by 4MB 00:05:29.458 EAL: Trying to obtain current memory policy. 00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 4 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 6MB 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was shrunk by 6MB 00:05:29.458 EAL: Trying to obtain current memory policy. 
00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 4 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 10MB 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was shrunk by 10MB 00:05:29.458 EAL: Trying to obtain current memory policy. 00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 4 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 18MB 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was shrunk by 18MB 00:05:29.458 EAL: Trying to obtain current memory policy. 00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 4 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 34MB 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was shrunk by 34MB 00:05:29.458 EAL: Trying to obtain current memory policy. 00:05:29.458 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.458 EAL: Restoring previous memory policy: 4 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.458 EAL: request: mp_malloc_sync 00:05:29.458 EAL: No shared files mode enabled, IPC is disabled 00:05:29.458 EAL: Heap on socket 0 was expanded by 66MB 00:05:29.458 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.718 EAL: request: mp_malloc_sync 00:05:29.718 EAL: No shared files mode enabled, IPC is disabled 00:05:29.718 EAL: Heap on socket 0 was shrunk by 66MB 00:05:29.718 EAL: Trying to obtain current memory policy. 00:05:29.718 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.718 EAL: Restoring previous memory policy: 4 00:05:29.718 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.718 EAL: request: mp_malloc_sync 00:05:29.718 EAL: No shared files mode enabled, IPC is disabled 00:05:29.718 EAL: Heap on socket 0 was expanded by 130MB 00:05:29.718 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.718 EAL: request: mp_malloc_sync 00:05:29.718 EAL: No shared files mode enabled, IPC is disabled 00:05:29.718 EAL: Heap on socket 0 was shrunk by 130MB 00:05:29.718 EAL: Trying to obtain current memory policy. 
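[annotation] Each "expanded by"/"shrunk by" pair in this trace corresponds to one allocate-and-free cycle at a growing size inside vtophys_spdk_malloc_test. A minimal sketch of that allocation pattern, assuming only the public SPDK env API (spdk_dma_malloc, spdk_vtophys); the test's actual internals may differ:

#include "spdk/env.h"
#include "spdk/stdinc.h"

/* Allocate DMA-safe buffers at growing sizes and translate each virtual
 * address to a physical one; every successful cycle produces one heap
 * "expanded by"/"shrunk by" pair like those logged here. Sketch only. */
static int
walk_heap_sizes(void)
{
	size_t sz;

	for (sz = 4 * 1024 * 1024; sz <= 1024UL * 1024 * 1024; sz *= 2) {
		void *buf = spdk_dma_malloc(sz, 0x1000, NULL);

		if (buf == NULL) {
			return -ENOMEM;                 /* heap could not expand */
		}
		if (spdk_vtophys(buf, NULL) == SPDK_VTOPHYS_ERROR) {
			spdk_dma_free(buf);
			return -EFAULT;                 /* translation failed */
		}
		spdk_dma_free(buf);                     /* heap may shrink again */
	}
	return 0;
}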
00:05:29.718 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.718 EAL: Restoring previous memory policy: 4 00:05:29.718 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.718 EAL: request: mp_malloc_sync 00:05:29.718 EAL: No shared files mode enabled, IPC is disabled 00:05:29.718 EAL: Heap on socket 0 was expanded by 258MB 00:05:29.718 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.718 EAL: request: mp_malloc_sync 00:05:29.718 EAL: No shared files mode enabled, IPC is disabled 00:05:29.718 EAL: Heap on socket 0 was shrunk by 258MB 00:05:29.718 EAL: Trying to obtain current memory policy. 00:05:29.718 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.977 EAL: Restoring previous memory policy: 4 00:05:29.977 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.977 EAL: request: mp_malloc_sync 00:05:29.977 EAL: No shared files mode enabled, IPC is disabled 00:05:29.977 EAL: Heap on socket 0 was expanded by 514MB 00:05:29.977 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.977 EAL: request: mp_malloc_sync 00:05:29.977 EAL: No shared files mode enabled, IPC is disabled 00:05:29.977 EAL: Heap on socket 0 was shrunk by 514MB 00:05:29.977 EAL: Trying to obtain current memory policy. 00:05:29.977 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:30.235 EAL: Restoring previous memory policy: 4 00:05:30.235 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.235 EAL: request: mp_malloc_sync 00:05:30.235 EAL: No shared files mode enabled, IPC is disabled 00:05:30.235 EAL: Heap on socket 0 was expanded by 1026MB 00:05:30.494 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.494 EAL: request: mp_malloc_sync 00:05:30.494 EAL: No shared files mode enabled, IPC is disabled 00:05:30.494 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:30.494 passed 00:05:30.494 00:05:30.494 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.494 suites 1 1 n/a 0 0 00:05:30.494 tests 2 2 2 0 0 00:05:30.494 asserts 497 497 497 0 n/a 00:05:30.494 00:05:30.494 Elapsed time = 0.961 seconds 00:05:30.494 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.494 EAL: request: mp_malloc_sync 00:05:30.494 EAL: No shared files mode enabled, IPC is disabled 00:05:30.494 EAL: Heap on socket 0 was shrunk by 2MB 00:05:30.494 EAL: No shared files mode enabled, IPC is disabled 00:05:30.494 EAL: No shared files mode enabled, IPC is disabled 00:05:30.494 EAL: No shared files mode enabled, IPC is disabled 00:05:30.494 00:05:30.494 real 0m1.073s 00:05:30.494 user 0m0.625s 00:05:30.494 sys 0m0.426s 00:05:30.494 02:53:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.494 02:53:25 -- common/autotest_common.sh@10 -- # set +x 00:05:30.494 ************************************ 00:05:30.494 END TEST env_vtophys 00:05:30.494 ************************************ 00:05:30.494 02:53:25 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.494 02:53:25 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:30.494 02:53:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.494 02:53:25 -- common/autotest_common.sh@10 -- # set +x 00:05:30.494 ************************************ 00:05:30.494 START TEST env_pci 00:05:30.494 ************************************ 00:05:30.494 02:53:25 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.494 00:05:30.494 00:05:30.494 CUnit - A unit testing framework for C - Version 2.1-3 00:05:30.494 
http://cunit.sourceforge.net/ 00:05:30.494 00:05:30.494 00:05:30.494 Suite: pci 00:05:30.495 Test: pci_hook ...[2024-07-14 02:53:25.716550] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 643052 has claimed it 00:05:30.754 EAL: Cannot find device (10000:00:01.0) 00:05:30.754 EAL: Failed to attach device on primary process 00:05:30.754 passed 00:05:30.754 00:05:30.754 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.754 suites 1 1 n/a 0 0 00:05:30.754 tests 1 1 1 0 0 00:05:30.754 asserts 25 25 25 0 n/a 00:05:30.754 00:05:30.754 Elapsed time = 0.033 seconds 00:05:30.754 00:05:30.754 real 0m0.050s 00:05:30.754 user 0m0.017s 00:05:30.754 sys 0m0.033s 00:05:30.754 02:53:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.754 02:53:25 -- common/autotest_common.sh@10 -- # set +x 00:05:30.754 ************************************ 00:05:30.754 END TEST env_pci 00:05:30.754 ************************************ 00:05:30.754 02:53:25 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:30.754 02:53:25 -- env/env.sh@15 -- # uname 00:05:30.754 02:53:25 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:30.754 02:53:25 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:30.754 02:53:25 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.754 02:53:25 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:30.754 02:53:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.754 02:53:25 -- common/autotest_common.sh@10 -- # set +x 00:05:30.754 ************************************ 00:05:30.754 START TEST env_dpdk_post_init 00:05:30.754 ************************************ 00:05:30.754 02:53:25 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.754 EAL: Detected CPU lcores: 112 00:05:30.754 EAL: Detected NUMA nodes: 2 00:05:30.754 EAL: Detected static linkage of DPDK 00:05:30.754 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:30.754 EAL: Selected IOVA mode 'VA' 00:05:30.754 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.754 EAL: VFIO support initialized 00:05:30.754 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:30.754 EAL: Using IOMMU type 1 (Type 1) 00:05:31.692 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:34.982 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:34.982 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:35.551 Starting DPDK initialization... 00:05:35.551 Starting SPDK post initialization... 00:05:35.551 SPDK NVMe probe 00:05:35.551 Attaching to 0000:d8:00.0 00:05:35.551 Attached to 0000:d8:00.0 00:05:35.551 Cleaning up... 
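[annotation] The "Starting DPDK initialization... SPDK NVMe probe" sequence above is the same env bring-up any SPDK application performs. A minimal sketch of that initialization using the flags this test was launched with (-c 0x1 --base-virtaddr=0x200000000000), assuming the public spdk_env_init API; the app name is illustrative and error handling is abbreviated:

#include "spdk/env.h"
#include "spdk/stdinc.h"

int
main(void)
{
	struct spdk_env_opts opts;

	spdk_env_opts_init(&opts);
	opts.name = "env_dpdk_post_init";        /* illustrative app name */
	opts.core_mask = "0x1";                  /* matches -c 0x1 above */
	opts.base_virtaddr = 0x200000000000ULL;  /* matches --base-virtaddr */

	if (spdk_env_init(&opts) < 0) {
		fprintf(stderr, "Unable to initialize SPDK env\n");
		return 1;
	}
	/* ... attach to NVMe controllers here, as in the probe above ... */
	spdk_env_fini();
	return 0;
}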
00:05:35.551 00:05:35.551 real 0m4.745s 00:05:35.551 user 0m3.561s 00:05:35.551 sys 0m0.429s 00:05:35.551 02:53:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.551 02:53:30 -- common/autotest_common.sh@10 -- # set +x 00:05:35.551 ************************************ 00:05:35.551 END TEST env_dpdk_post_init 00:05:35.551 ************************************ 00:05:35.551 02:53:30 -- env/env.sh@26 -- # uname 00:05:35.551 02:53:30 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:35.551 02:53:30 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:35.551 02:53:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.551 02:53:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.551 02:53:30 -- common/autotest_common.sh@10 -- # set +x 00:05:35.551 ************************************ 00:05:35.551 START TEST env_mem_callbacks 00:05:35.551 ************************************ 00:05:35.551 02:53:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:35.551 EAL: Detected CPU lcores: 112 00:05:35.551 EAL: Detected NUMA nodes: 2 00:05:35.551 EAL: Detected static linkage of DPDK 00:05:35.551 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:35.551 EAL: Selected IOVA mode 'VA' 00:05:35.551 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.551 EAL: VFIO support initialized 00:05:35.551 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:35.551 00:05:35.551 00:05:35.551 CUnit - A unit testing framework for C - Version 2.1-3 00:05:35.551 http://cunit.sourceforge.net/ 00:05:35.551 00:05:35.551 00:05:35.551 Suite: memory 00:05:35.551 Test: test ... 
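[annotation] The register/malloc/buf/unregister lines that follow are printed from a memory-map notify callback as the mem_callbacks test registers and releases regions. A minimal user-side sketch, assuming the public spdk_mem_register/spdk_mem_unregister API; the printing callback itself is internal to the test:

#include "spdk/env.h"
#include "spdk/stdinc.h"

/* Register an externally allocated, 2 MB-aligned region with the SPDK
 * memory map, which invokes every registered notify callback (the test's
 * callback prints the "register"/"unregister" lines below). Sketch only. */
static int
register_external_buffer(size_t len)
{
	void *buf = NULL;

	if (posix_memalign(&buf, 0x200000, len) != 0) {
		return -ENOMEM;
	}
	if (spdk_mem_register(buf, len) != 0) {
		free(buf);
		return -EINVAL;
	}
	/* ... the region is now usable for translated I/O ... */
	spdk_mem_unregister(buf, len);
	free(buf);
	return 0;
}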
00:05:35.551 register 0x200000200000 2097152 00:05:35.551 malloc 3145728 00:05:35.551 register 0x200000400000 4194304 00:05:35.551 buf 0x200000500000 len 3145728 PASSED 00:05:35.551 malloc 64 00:05:35.551 buf 0x2000004fff40 len 64 PASSED 00:05:35.551 malloc 4194304 00:05:35.551 register 0x200000800000 6291456 00:05:35.551 buf 0x200000a00000 len 4194304 PASSED 00:05:35.551 free 0x200000500000 3145728 00:05:35.551 free 0x2000004fff40 64 00:05:35.551 unregister 0x200000400000 4194304 PASSED 00:05:35.551 free 0x200000a00000 4194304 00:05:35.551 unregister 0x200000800000 6291456 PASSED 00:05:35.551 malloc 8388608 00:05:35.551 register 0x200000400000 10485760 00:05:35.551 buf 0x200000600000 len 8388608 PASSED 00:05:35.551 free 0x200000600000 8388608 00:05:35.551 unregister 0x200000400000 10485760 PASSED 00:05:35.551 passed 00:05:35.551 00:05:35.551 Run Summary: Type Total Ran Passed Failed Inactive 00:05:35.551 suites 1 1 n/a 0 0 00:05:35.551 tests 1 1 1 0 0 00:05:35.551 asserts 15 15 15 0 n/a 00:05:35.551 00:05:35.551 Elapsed time = 0.005 seconds 00:05:35.551 00:05:35.551 real 0m0.061s 00:05:35.551 user 0m0.019s 00:05:35.551 sys 0m0.042s 00:05:35.551 02:53:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.551 02:53:30 -- common/autotest_common.sh@10 -- # set +x 00:05:35.551 ************************************ 00:05:35.551 END TEST env_mem_callbacks 00:05:35.551 ************************************ 00:05:35.551 00:05:35.551 real 0m6.388s 00:05:35.551 user 0m4.436s 00:05:35.551 sys 0m1.222s 00:05:35.551 02:53:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.551 02:53:30 -- common/autotest_common.sh@10 -- # set +x 00:05:35.551 ************************************ 00:05:35.551 END TEST env 00:05:35.551 ************************************ 00:05:35.551 02:53:30 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:35.551 02:53:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.551 02:53:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.551 02:53:30 -- common/autotest_common.sh@10 -- # set +x 00:05:35.551 ************************************ 00:05:35.551 START TEST rpc 00:05:35.551 ************************************ 00:05:35.551 02:53:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:35.811 * Looking for test storage... 00:05:35.811 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:35.811 02:53:30 -- rpc/rpc.sh@65 -- # spdk_pid=644078 00:05:35.811 02:53:30 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:35.811 02:53:30 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:35.811 02:53:30 -- rpc/rpc.sh@67 -- # waitforlisten 644078 00:05:35.811 02:53:30 -- common/autotest_common.sh@819 -- # '[' -z 644078 ']' 00:05:35.811 02:53:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.811 02:53:30 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:35.811 02:53:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
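[annotation] The rpc suite starting here launches a fresh spdk_tgt and issues every rpc_cmd over the UNIX socket it waits on above. On the C side, each method name is bound to a handler at load time; a minimal sketch of such a binding, assuming SPDK's public rpc/jsonrpc headers, with a hypothetical method name (not one of the real bdev handlers this test calls):

#include "spdk/rpc.h"
#include "spdk/jsonrpc.h"
#include "spdk/json.h"
#include "spdk/util.h"
#include "spdk/stdinc.h"

/* Illustrative request payload: {"name": "..."} */
struct rpc_example_req {
	char *name;
};

static const struct spdk_json_object_decoder rpc_example_decoders[] = {
	{"name", offsetof(struct rpc_example_req, name), spdk_json_decode_string},
};

static void
rpc_example_handler(struct spdk_jsonrpc_request *request,
		    const struct spdk_json_val *params)
{
	struct rpc_example_req req = {};

	if (spdk_json_decode_object(params, rpc_example_decoders,
				    SPDK_COUNTOF(rpc_example_decoders), &req)) {
		spdk_jsonrpc_send_error_response(request,
						 SPDK_JSONRPC_ERROR_INVALID_PARAMS,
						 "Invalid parameters");
		return;
	}
	/* ... act on req.name ... */
	spdk_jsonrpc_send_bool_response(request, true);
	free(req.name);
}
/* Register under a hypothetical method name, callable once the target is up. */
SPDK_RPC_REGISTER("example_method", rpc_example_handler, SPDK_RPC_RUNTIME)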
00:05:35.811 02:53:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:35.811 02:53:30 -- common/autotest_common.sh@10 -- # set +x 00:05:35.811 [2024-07-14 02:53:30.860935] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:35.811 [2024-07-14 02:53:30.860999] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644078 ] 00:05:35.811 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.811 [2024-07-14 02:53:30.927001] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.811 [2024-07-14 02:53:30.963537] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:35.811 [2024-07-14 02:53:30.963659] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:35.811 [2024-07-14 02:53:30.963670] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 644078' to capture a snapshot of events at runtime. 00:05:35.811 [2024-07-14 02:53:30.963679] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid644078 for offline analysis/debug. 00:05:35.811 [2024-07-14 02:53:30.963698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.750 02:53:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:36.750 02:53:31 -- common/autotest_common.sh@852 -- # return 0 00:05:36.750 02:53:31 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:36.750 02:53:31 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:36.750 02:53:31 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:36.750 02:53:31 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:36.750 02:53:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.750 02:53:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.750 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.750 ************************************ 00:05:36.750 START TEST rpc_integrity 00:05:36.750 ************************************ 00:05:36.750 02:53:31 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:36.750 02:53:31 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:36.750 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.750 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.750 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.750 02:53:31 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:36.750 02:53:31 -- rpc/rpc.sh@13 -- # jq length 00:05:36.750 02:53:31 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:36.750 02:53:31 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:36.750 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.750 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.750 02:53:31 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.750 02:53:31 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:36.750 02:53:31 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:36.750 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.750 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.750 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.750 02:53:31 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:36.750 { 00:05:36.750 "name": "Malloc0", 00:05:36.750 "aliases": [ 00:05:36.750 "32172e58-9c6a-4d7c-9ac5-8372279793bd" 00:05:36.750 ], 00:05:36.750 "product_name": "Malloc disk", 00:05:36.750 "block_size": 512, 00:05:36.750 "num_blocks": 16384, 00:05:36.750 "uuid": "32172e58-9c6a-4d7c-9ac5-8372279793bd", 00:05:36.750 "assigned_rate_limits": { 00:05:36.750 "rw_ios_per_sec": 0, 00:05:36.750 "rw_mbytes_per_sec": 0, 00:05:36.750 "r_mbytes_per_sec": 0, 00:05:36.750 "w_mbytes_per_sec": 0 00:05:36.750 }, 00:05:36.750 "claimed": false, 00:05:36.750 "zoned": false, 00:05:36.750 "supported_io_types": { 00:05:36.750 "read": true, 00:05:36.750 "write": true, 00:05:36.750 "unmap": true, 00:05:36.750 "write_zeroes": true, 00:05:36.750 "flush": true, 00:05:36.750 "reset": true, 00:05:36.750 "compare": false, 00:05:36.750 "compare_and_write": false, 00:05:36.750 "abort": true, 00:05:36.750 "nvme_admin": false, 00:05:36.750 "nvme_io": false 00:05:36.750 }, 00:05:36.750 "memory_domains": [ 00:05:36.750 { 00:05:36.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.750 "dma_device_type": 2 00:05:36.750 } 00:05:36.750 ], 00:05:36.750 "driver_specific": {} 00:05:36.750 } 00:05:36.750 ]' 00:05:36.750 02:53:31 -- rpc/rpc.sh@17 -- # jq length 00:05:36.750 02:53:31 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:36.751 02:53:31 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:36.751 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 [2024-07-14 02:53:31.805484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:36.751 [2024-07-14 02:53:31.805519] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:36.751 [2024-07-14 02:53:31.805539] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4c92230 00:05:36.751 [2024-07-14 02:53:31.805551] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:36.751 [2024-07-14 02:53:31.806408] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:36.751 [2024-07-14 02:53:31.806430] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:36.751 Passthru0 00:05:36.751 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.751 02:53:31 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:36.751 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.751 02:53:31 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:36.751 { 00:05:36.751 "name": "Malloc0", 00:05:36.751 "aliases": [ 00:05:36.751 "32172e58-9c6a-4d7c-9ac5-8372279793bd" 00:05:36.751 ], 00:05:36.751 "product_name": "Malloc disk", 00:05:36.751 "block_size": 512, 00:05:36.751 "num_blocks": 16384, 00:05:36.751 "uuid": "32172e58-9c6a-4d7c-9ac5-8372279793bd", 00:05:36.751 "assigned_rate_limits": { 00:05:36.751 "rw_ios_per_sec": 0, 00:05:36.751 
"rw_mbytes_per_sec": 0, 00:05:36.751 "r_mbytes_per_sec": 0, 00:05:36.751 "w_mbytes_per_sec": 0 00:05:36.751 }, 00:05:36.751 "claimed": true, 00:05:36.751 "claim_type": "exclusive_write", 00:05:36.751 "zoned": false, 00:05:36.751 "supported_io_types": { 00:05:36.751 "read": true, 00:05:36.751 "write": true, 00:05:36.751 "unmap": true, 00:05:36.751 "write_zeroes": true, 00:05:36.751 "flush": true, 00:05:36.751 "reset": true, 00:05:36.751 "compare": false, 00:05:36.751 "compare_and_write": false, 00:05:36.751 "abort": true, 00:05:36.751 "nvme_admin": false, 00:05:36.751 "nvme_io": false 00:05:36.751 }, 00:05:36.751 "memory_domains": [ 00:05:36.751 { 00:05:36.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.751 "dma_device_type": 2 00:05:36.751 } 00:05:36.751 ], 00:05:36.751 "driver_specific": {} 00:05:36.751 }, 00:05:36.751 { 00:05:36.751 "name": "Passthru0", 00:05:36.751 "aliases": [ 00:05:36.751 "f955adf4-07e3-57da-9e00-cab181fa873f" 00:05:36.751 ], 00:05:36.751 "product_name": "passthru", 00:05:36.751 "block_size": 512, 00:05:36.751 "num_blocks": 16384, 00:05:36.751 "uuid": "f955adf4-07e3-57da-9e00-cab181fa873f", 00:05:36.751 "assigned_rate_limits": { 00:05:36.751 "rw_ios_per_sec": 0, 00:05:36.751 "rw_mbytes_per_sec": 0, 00:05:36.751 "r_mbytes_per_sec": 0, 00:05:36.751 "w_mbytes_per_sec": 0 00:05:36.751 }, 00:05:36.751 "claimed": false, 00:05:36.751 "zoned": false, 00:05:36.751 "supported_io_types": { 00:05:36.751 "read": true, 00:05:36.751 "write": true, 00:05:36.751 "unmap": true, 00:05:36.751 "write_zeroes": true, 00:05:36.751 "flush": true, 00:05:36.751 "reset": true, 00:05:36.751 "compare": false, 00:05:36.751 "compare_and_write": false, 00:05:36.751 "abort": true, 00:05:36.751 "nvme_admin": false, 00:05:36.751 "nvme_io": false 00:05:36.751 }, 00:05:36.751 "memory_domains": [ 00:05:36.751 { 00:05:36.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.751 "dma_device_type": 2 00:05:36.751 } 00:05:36.751 ], 00:05:36.751 "driver_specific": { 00:05:36.751 "passthru": { 00:05:36.751 "name": "Passthru0", 00:05:36.751 "base_bdev_name": "Malloc0" 00:05:36.751 } 00:05:36.751 } 00:05:36.751 } 00:05:36.751 ]' 00:05:36.751 02:53:31 -- rpc/rpc.sh@21 -- # jq length 00:05:36.751 02:53:31 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:36.751 02:53:31 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:36.751 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.751 02:53:31 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:36.751 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.751 02:53:31 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:36.751 02:53:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 02:53:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.751 02:53:31 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:36.751 02:53:31 -- rpc/rpc.sh@26 -- # jq length 00:05:36.751 02:53:31 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:36.751 00:05:36.751 real 0m0.295s 00:05:36.751 user 0m0.173s 00:05:36.751 sys 0m0.056s 00:05:36.751 02:53:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 
************************************ 00:05:36.751 END TEST rpc_integrity 00:05:36.751 ************************************ 00:05:36.751 02:53:31 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:36.751 02:53:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.751 02:53:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.751 02:53:31 -- common/autotest_common.sh@10 -- # set +x 00:05:36.751 ************************************ 00:05:36.751 START TEST rpc_plugins 00:05:36.751 ************************************ 00:05:36.751 02:53:31 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:37.010 02:53:32 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:37.010 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.010 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.010 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.011 02:53:32 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:37.011 02:53:32 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:37.011 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.011 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.011 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.011 02:53:32 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:37.011 { 00:05:37.011 "name": "Malloc1", 00:05:37.011 "aliases": [ 00:05:37.011 "a9c13392-1e91-453e-a5a7-013a1b938eab" 00:05:37.011 ], 00:05:37.011 "product_name": "Malloc disk", 00:05:37.011 "block_size": 4096, 00:05:37.011 "num_blocks": 256, 00:05:37.011 "uuid": "a9c13392-1e91-453e-a5a7-013a1b938eab", 00:05:37.011 "assigned_rate_limits": { 00:05:37.011 "rw_ios_per_sec": 0, 00:05:37.011 "rw_mbytes_per_sec": 0, 00:05:37.011 "r_mbytes_per_sec": 0, 00:05:37.011 "w_mbytes_per_sec": 0 00:05:37.011 }, 00:05:37.011 "claimed": false, 00:05:37.011 "zoned": false, 00:05:37.011 "supported_io_types": { 00:05:37.011 "read": true, 00:05:37.011 "write": true, 00:05:37.011 "unmap": true, 00:05:37.011 "write_zeroes": true, 00:05:37.011 "flush": true, 00:05:37.011 "reset": true, 00:05:37.011 "compare": false, 00:05:37.011 "compare_and_write": false, 00:05:37.011 "abort": true, 00:05:37.011 "nvme_admin": false, 00:05:37.011 "nvme_io": false 00:05:37.011 }, 00:05:37.011 "memory_domains": [ 00:05:37.011 { 00:05:37.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.011 "dma_device_type": 2 00:05:37.011 } 00:05:37.011 ], 00:05:37.011 "driver_specific": {} 00:05:37.011 } 00:05:37.011 ]' 00:05:37.011 02:53:32 -- rpc/rpc.sh@32 -- # jq length 00:05:37.011 02:53:32 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:37.011 02:53:32 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:37.011 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.011 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.011 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.011 02:53:32 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:37.011 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.011 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.011 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.011 02:53:32 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:37.011 02:53:32 -- rpc/rpc.sh@36 -- # jq length 00:05:37.011 02:53:32 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:37.011 00:05:37.011 real 0m0.139s 00:05:37.011 user 0m0.087s 00:05:37.011 sys 0m0.019s 00:05:37.011 02:53:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:05:37.011 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.011 ************************************ 00:05:37.011 END TEST rpc_plugins 00:05:37.011 ************************************ 00:05:37.011 02:53:32 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:37.011 02:53:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.011 02:53:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.011 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.011 ************************************ 00:05:37.011 START TEST rpc_trace_cmd_test 00:05:37.011 ************************************ 00:05:37.011 02:53:32 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:37.011 02:53:32 -- rpc/rpc.sh@40 -- # local info 00:05:37.011 02:53:32 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:37.011 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.011 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.011 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.011 02:53:32 -- rpc/rpc.sh@42 -- # info='{ 00:05:37.011 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid644078", 00:05:37.011 "tpoint_group_mask": "0x8", 00:05:37.011 "iscsi_conn": { 00:05:37.011 "mask": "0x2", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "scsi": { 00:05:37.011 "mask": "0x4", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "bdev": { 00:05:37.011 "mask": "0x8", 00:05:37.011 "tpoint_mask": "0xffffffffffffffff" 00:05:37.011 }, 00:05:37.011 "nvmf_rdma": { 00:05:37.011 "mask": "0x10", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "nvmf_tcp": { 00:05:37.011 "mask": "0x20", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "ftl": { 00:05:37.011 "mask": "0x40", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "blobfs": { 00:05:37.011 "mask": "0x80", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "dsa": { 00:05:37.011 "mask": "0x200", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "thread": { 00:05:37.011 "mask": "0x400", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "nvme_pcie": { 00:05:37.011 "mask": "0x800", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "iaa": { 00:05:37.011 "mask": "0x1000", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "nvme_tcp": { 00:05:37.011 "mask": "0x2000", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 }, 00:05:37.011 "bdev_nvme": { 00:05:37.011 "mask": "0x4000", 00:05:37.011 "tpoint_mask": "0x0" 00:05:37.011 } 00:05:37.011 }' 00:05:37.011 02:53:32 -- rpc/rpc.sh@43 -- # jq length 00:05:37.011 02:53:32 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:37.011 02:53:32 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:37.270 02:53:32 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:37.270 02:53:32 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:37.270 02:53:32 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:37.270 02:53:32 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:37.270 02:53:32 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:37.270 02:53:32 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:37.270 02:53:32 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:37.270 00:05:37.270 real 0m0.214s 00:05:37.270 user 0m0.171s 00:05:37.270 sys 0m0.035s 00:05:37.270 02:53:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.270 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.270 
************************************ 00:05:37.270 END TEST rpc_trace_cmd_test 00:05:37.270 ************************************ 00:05:37.270 02:53:32 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:37.270 02:53:32 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:37.270 02:53:32 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:37.270 02:53:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.270 02:53:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.270 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.270 ************************************ 00:05:37.270 START TEST rpc_daemon_integrity 00:05:37.270 ************************************ 00:05:37.270 02:53:32 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:37.270 02:53:32 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:37.270 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.270 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.270 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.270 02:53:32 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:37.270 02:53:32 -- rpc/rpc.sh@13 -- # jq length 00:05:37.270 02:53:32 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:37.270 02:53:32 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:37.270 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.270 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.270 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.270 02:53:32 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:37.270 02:53:32 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:37.270 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.270 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.530 02:53:32 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:37.530 { 00:05:37.530 "name": "Malloc2", 00:05:37.530 "aliases": [ 00:05:37.530 "0ebd2dee-951d-441a-bbe4-7ac539b1019d" 00:05:37.530 ], 00:05:37.530 "product_name": "Malloc disk", 00:05:37.530 "block_size": 512, 00:05:37.530 "num_blocks": 16384, 00:05:37.530 "uuid": "0ebd2dee-951d-441a-bbe4-7ac539b1019d", 00:05:37.530 "assigned_rate_limits": { 00:05:37.530 "rw_ios_per_sec": 0, 00:05:37.530 "rw_mbytes_per_sec": 0, 00:05:37.530 "r_mbytes_per_sec": 0, 00:05:37.530 "w_mbytes_per_sec": 0 00:05:37.530 }, 00:05:37.530 "claimed": false, 00:05:37.530 "zoned": false, 00:05:37.530 "supported_io_types": { 00:05:37.530 "read": true, 00:05:37.530 "write": true, 00:05:37.530 "unmap": true, 00:05:37.530 "write_zeroes": true, 00:05:37.530 "flush": true, 00:05:37.530 "reset": true, 00:05:37.530 "compare": false, 00:05:37.530 "compare_and_write": false, 00:05:37.530 "abort": true, 00:05:37.530 "nvme_admin": false, 00:05:37.530 "nvme_io": false 00:05:37.530 }, 00:05:37.530 "memory_domains": [ 00:05:37.530 { 00:05:37.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.530 "dma_device_type": 2 00:05:37.530 } 00:05:37.530 ], 00:05:37.530 "driver_specific": {} 00:05:37.530 } 00:05:37.530 ]' 00:05:37.530 02:53:32 -- rpc/rpc.sh@17 -- # jq length 00:05:37.530 02:53:32 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:37.530 02:53:32 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:37.530 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.530 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 [2024-07-14 02:53:32.587487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:37.530 [2024-07-14 02:53:32.587519] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:37.530 [2024-07-14 02:53:32.587541] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4afa8d0 00:05:37.530 [2024-07-14 02:53:32.587550] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:37.530 [2024-07-14 02:53:32.588273] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:37.530 [2024-07-14 02:53:32.588303] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:37.530 Passthru0 00:05:37.530 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.530 02:53:32 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:37.530 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.530 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.530 02:53:32 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:37.530 { 00:05:37.530 "name": "Malloc2", 00:05:37.530 "aliases": [ 00:05:37.530 "0ebd2dee-951d-441a-bbe4-7ac539b1019d" 00:05:37.530 ], 00:05:37.530 "product_name": "Malloc disk", 00:05:37.530 "block_size": 512, 00:05:37.530 "num_blocks": 16384, 00:05:37.530 "uuid": "0ebd2dee-951d-441a-bbe4-7ac539b1019d", 00:05:37.530 "assigned_rate_limits": { 00:05:37.530 "rw_ios_per_sec": 0, 00:05:37.530 "rw_mbytes_per_sec": 0, 00:05:37.530 "r_mbytes_per_sec": 0, 00:05:37.530 "w_mbytes_per_sec": 0 00:05:37.530 }, 00:05:37.530 "claimed": true, 00:05:37.530 "claim_type": "exclusive_write", 00:05:37.530 "zoned": false, 00:05:37.530 "supported_io_types": { 00:05:37.530 "read": true, 00:05:37.530 "write": true, 00:05:37.530 "unmap": true, 00:05:37.530 "write_zeroes": true, 00:05:37.530 "flush": true, 00:05:37.530 "reset": true, 00:05:37.530 "compare": false, 00:05:37.530 "compare_and_write": false, 00:05:37.530 "abort": true, 00:05:37.530 "nvme_admin": false, 00:05:37.530 "nvme_io": false 00:05:37.530 }, 00:05:37.530 "memory_domains": [ 00:05:37.530 { 00:05:37.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.530 "dma_device_type": 2 00:05:37.530 } 00:05:37.530 ], 00:05:37.530 "driver_specific": {} 00:05:37.530 }, 00:05:37.530 { 00:05:37.530 "name": "Passthru0", 00:05:37.530 "aliases": [ 00:05:37.530 "a35f2ecf-2df1-5387-9b2e-80547a0fb1f9" 00:05:37.530 ], 00:05:37.530 "product_name": "passthru", 00:05:37.530 "block_size": 512, 00:05:37.530 "num_blocks": 16384, 00:05:37.530 "uuid": "a35f2ecf-2df1-5387-9b2e-80547a0fb1f9", 00:05:37.530 "assigned_rate_limits": { 00:05:37.530 "rw_ios_per_sec": 0, 00:05:37.530 "rw_mbytes_per_sec": 0, 00:05:37.530 "r_mbytes_per_sec": 0, 00:05:37.530 "w_mbytes_per_sec": 0 00:05:37.530 }, 00:05:37.530 "claimed": false, 00:05:37.530 "zoned": false, 00:05:37.530 "supported_io_types": { 00:05:37.530 "read": true, 00:05:37.530 "write": true, 00:05:37.530 "unmap": true, 00:05:37.530 "write_zeroes": true, 00:05:37.530 "flush": true, 00:05:37.530 "reset": true, 00:05:37.530 "compare": false, 00:05:37.530 "compare_and_write": false, 00:05:37.530 "abort": true, 00:05:37.530 "nvme_admin": false, 00:05:37.530 "nvme_io": false 00:05:37.530 }, 00:05:37.530 "memory_domains": [ 00:05:37.530 { 00:05:37.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.530 "dma_device_type": 2 00:05:37.530 } 00:05:37.530 ], 00:05:37.530 "driver_specific": { 00:05:37.530 "passthru": { 00:05:37.530 "name": "Passthru0", 00:05:37.530 "base_bdev_name": "Malloc2" 00:05:37.530 } 
00:05:37.530 } 00:05:37.530 } 00:05:37.530 ]' 00:05:37.530 02:53:32 -- rpc/rpc.sh@21 -- # jq length 00:05:37.530 02:53:32 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:37.530 02:53:32 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:37.530 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.530 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.530 02:53:32 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:37.530 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.530 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.530 02:53:32 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:37.530 02:53:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.530 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 02:53:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.530 02:53:32 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:37.530 02:53:32 -- rpc/rpc.sh@26 -- # jq length 00:05:37.530 02:53:32 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:37.530 00:05:37.530 real 0m0.286s 00:05:37.530 user 0m0.175s 00:05:37.530 sys 0m0.044s 00:05:37.530 02:53:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.530 02:53:32 -- common/autotest_common.sh@10 -- # set +x 00:05:37.530 ************************************ 00:05:37.530 END TEST rpc_daemon_integrity 00:05:37.530 ************************************ 00:05:37.531 02:53:32 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:37.531 02:53:32 -- rpc/rpc.sh@84 -- # killprocess 644078 00:05:37.531 02:53:32 -- common/autotest_common.sh@926 -- # '[' -z 644078 ']' 00:05:37.531 02:53:32 -- common/autotest_common.sh@930 -- # kill -0 644078 00:05:37.531 02:53:32 -- common/autotest_common.sh@931 -- # uname 00:05:37.790 02:53:32 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:37.790 02:53:32 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 644078 00:05:37.790 02:53:32 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:37.790 02:53:32 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:37.790 02:53:32 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 644078' 00:05:37.790 killing process with pid 644078 00:05:37.790 02:53:32 -- common/autotest_common.sh@945 -- # kill 644078 00:05:37.790 02:53:32 -- common/autotest_common.sh@950 -- # wait 644078 00:05:38.049 00:05:38.049 real 0m2.368s 00:05:38.049 user 0m2.986s 00:05:38.049 sys 0m0.717s 00:05:38.049 02:53:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.049 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.049 ************************************ 00:05:38.049 END TEST rpc 00:05:38.049 ************************************ 00:05:38.049 02:53:33 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:38.049 02:53:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:38.049 02:53:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.049 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.049 ************************************ 00:05:38.049 START TEST rpc_client 00:05:38.049 ************************************ 00:05:38.049 02:53:33 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:38.049 * Looking for test storage... 00:05:38.049 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:38.049 02:53:33 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:38.049 OK 00:05:38.049 02:53:33 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:38.049 00:05:38.049 real 0m0.111s 00:05:38.049 user 0m0.051s 00:05:38.049 sys 0m0.069s 00:05:38.049 02:53:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.049 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.049 ************************************ 00:05:38.049 END TEST rpc_client 00:05:38.049 ************************************ 00:05:38.307 02:53:33 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:38.307 02:53:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:38.307 02:53:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.307 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.307 ************************************ 00:05:38.307 START TEST json_config 00:05:38.307 ************************************ 00:05:38.307 02:53:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:38.307 02:53:33 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:38.307 02:53:33 -- nvmf/common.sh@7 -- # uname -s 00:05:38.307 02:53:33 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:38.307 02:53:33 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:38.307 02:53:33 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:38.307 02:53:33 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:38.307 02:53:33 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:38.307 02:53:33 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:38.307 02:53:33 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:38.307 02:53:33 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:38.307 02:53:33 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:38.307 02:53:33 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:38.307 02:53:33 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.307 02:53:33 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.307 02:53:33 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:38.307 02:53:33 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:38.307 02:53:33 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:38.307 02:53:33 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:38.307 02:53:33 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:38.307 02:53:33 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:38.307 02:53:33 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:38.307 02:53:33 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.307 02:53:33 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.307 02:53:33 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.307 02:53:33 -- paths/export.sh@5 -- # export PATH 00:05:38.307 02:53:33 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.307 02:53:33 -- nvmf/common.sh@46 -- # : 0 00:05:38.307 02:53:33 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:38.307 02:53:33 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:38.307 02:53:33 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:38.307 02:53:33 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:38.307 02:53:33 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:38.307 02:53:33 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:38.307 02:53:33 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:38.307 02:53:33 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:38.307 02:53:33 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:38.307 02:53:33 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:38.307 02:53:33 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:38.307 02:53:33 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:38.307 02:53:33 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:38.307 WARNING: No tests are enabled so not running JSON configuration tests 00:05:38.307 02:53:33 -- json_config/json_config.sh@27 -- # exit 0 00:05:38.307 00:05:38.307 real 0m0.102s 00:05:38.307 user 0m0.041s 00:05:38.307 sys 0m0.061s 00:05:38.307 02:53:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.307 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.307 ************************************ 00:05:38.307 END TEST json_config 00:05:38.307 ************************************ 00:05:38.307 02:53:33 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:38.307 02:53:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:38.307 02:53:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.307 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.307 ************************************ 00:05:38.307 START TEST json_config_extra_key 00:05:38.307 ************************************ 00:05:38.308 02:53:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:38.308 02:53:33 -- nvmf/common.sh@7 -- # uname -s 00:05:38.308 02:53:33 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:38.308 02:53:33 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:38.308 02:53:33 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:38.308 02:53:33 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:38.308 02:53:33 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:38.308 02:53:33 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:38.308 02:53:33 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:38.308 02:53:33 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:38.308 02:53:33 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:38.308 02:53:33 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:38.308 02:53:33 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.308 02:53:33 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.308 02:53:33 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:38.308 02:53:33 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:38.308 02:53:33 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:38.308 02:53:33 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:38.308 02:53:33 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:38.308 02:53:33 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:38.308 02:53:33 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:38.308 02:53:33 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.308 02:53:33 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.308 02:53:33 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.308 02:53:33 -- paths/export.sh@5 -- # export PATH 00:05:38.308 02:53:33 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.308 02:53:33 -- nvmf/common.sh@46 -- # : 0 00:05:38.308 02:53:33 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:38.308 02:53:33 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:38.308 02:53:33 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:38.308 02:53:33 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:38.308 02:53:33 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:38.308 02:53:33 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:38.308 02:53:33 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:38.308 02:53:33 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:38.308 INFO: launching applications... 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=644837 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:38.308 Waiting for target to run... 
00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 644837 /var/tmp/spdk_tgt.sock 00:05:38.308 02:53:33 -- common/autotest_common.sh@819 -- # '[' -z 644837 ']' 00:05:38.308 02:53:33 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:38.308 02:53:33 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:38.308 02:53:33 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:38.308 02:53:33 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:38.308 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:38.308 02:53:33 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:38.308 02:53:33 -- common/autotest_common.sh@10 -- # set +x 00:05:38.566 [2024-07-14 02:53:33.573496] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:38.566 [2024-07-14 02:53:33.573574] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644837 ] 00:05:38.566 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.824 [2024-07-14 02:53:34.008293] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.824 [2024-07-14 02:53:34.037401] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:38.824 [2024-07-14 02:53:34.037507] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.393 02:53:34 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:39.393 02:53:34 -- common/autotest_common.sh@852 -- # return 0 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:39.393 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:39.393 INFO: shutting down applications... 
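For reference, the json_config_extra_key launch logged above boils down to starting spdk_tgt with a pre-baked JSON config and polling its RPC socket until it answers. A minimal hand-run sketch (paths shortened relative to the jenkins workspace; the readiness probe via spdk_get_version is an illustrative stand-in for the test's waitforlisten helper):

  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
      --json test/json_config/extra_key.json &
  TGT_PID=$!
  # poll until the target answers on its UNIX-domain RPC socket
  until ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock spdk_get_version >/dev/null 2>&1; do
      sleep 0.5
  done
  kill -SIGINT $TGT_PID    # the shutdown path exercised by json_config_test_shutdown_app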
00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 644837 ]] 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 644837 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@50 -- # kill -0 644837 00:05:39.393 02:53:34 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@50 -- # kill -0 644837 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:39.653 SPDK target shutdown done 00:05:39.653 02:53:34 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:39.653 Success 00:05:39.653 00:05:39.653 real 0m1.399s 00:05:39.653 user 0m0.997s 00:05:39.653 sys 0m0.509s 00:05:39.653 02:53:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.653 02:53:34 -- common/autotest_common.sh@10 -- # set +x 00:05:39.653 ************************************ 00:05:39.653 END TEST json_config_extra_key 00:05:39.653 ************************************ 00:05:39.941 02:53:34 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:39.941 02:53:34 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:39.941 02:53:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:39.941 02:53:34 -- common/autotest_common.sh@10 -- # set +x 00:05:39.941 ************************************ 00:05:39.941 START TEST alias_rpc 00:05:39.941 ************************************ 00:05:39.941 02:53:34 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:39.941 * Looking for test storage... 00:05:39.941 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:39.941 02:53:35 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:39.941 02:53:35 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=645122 00:05:39.941 02:53:35 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 645122 00:05:39.941 02:53:35 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:39.942 02:53:35 -- common/autotest_common.sh@819 -- # '[' -z 645122 ']' 00:05:39.942 02:53:35 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.942 02:53:35 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:39.942 02:53:35 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:39.942 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:39.942 02:53:35 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:39.942 02:53:35 -- common/autotest_common.sh@10 -- # set +x 00:05:39.942 [2024-07-14 02:53:35.049602] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:39.942 [2024-07-14 02:53:35.049697] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645122 ] 00:05:39.942 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.942 [2024-07-14 02:53:35.118871] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.942 [2024-07-14 02:53:35.156355] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:39.942 [2024-07-14 02:53:35.156483] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.914 02:53:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:40.914 02:53:35 -- common/autotest_common.sh@852 -- # return 0 00:05:40.914 02:53:35 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:40.914 02:53:36 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 645122 00:05:40.914 02:53:36 -- common/autotest_common.sh@926 -- # '[' -z 645122 ']' 00:05:40.914 02:53:36 -- common/autotest_common.sh@930 -- # kill -0 645122 00:05:40.914 02:53:36 -- common/autotest_common.sh@931 -- # uname 00:05:40.914 02:53:36 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:40.914 02:53:36 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 645122 00:05:40.914 02:53:36 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:40.914 02:53:36 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:40.914 02:53:36 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 645122' 00:05:40.914 killing process with pid 645122 00:05:40.914 02:53:36 -- common/autotest_common.sh@945 -- # kill 645122 00:05:40.914 02:53:36 -- common/autotest_common.sh@950 -- # wait 645122 00:05:41.171 00:05:41.171 real 0m1.472s 00:05:41.171 user 0m1.556s 00:05:41.171 sys 0m0.453s 00:05:41.171 02:53:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.171 02:53:36 -- common/autotest_common.sh@10 -- # set +x 00:05:41.171 ************************************ 00:05:41.171 END TEST alias_rpc 00:05:41.171 ************************************ 00:05:41.430 02:53:36 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:41.430 02:53:36 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:41.430 02:53:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:41.430 02:53:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:41.430 02:53:36 -- common/autotest_common.sh@10 -- # set +x 00:05:41.430 ************************************ 00:05:41.430 START TEST spdkcli_tcp 00:05:41.430 ************************************ 00:05:41.430 02:53:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:41.430 * Looking for test storage... 
00:05:41.430 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:41.430 02:53:36 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:41.430 02:53:36 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:41.430 02:53:36 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:41.430 02:53:36 -- common/autotest_common.sh@10 -- # set +x 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=645437 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@27 -- # waitforlisten 645437 00:05:41.430 02:53:36 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:41.430 02:53:36 -- common/autotest_common.sh@819 -- # '[' -z 645437 ']' 00:05:41.430 02:53:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.430 02:53:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:41.430 02:53:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:41.430 02:53:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:41.430 02:53:36 -- common/autotest_common.sh@10 -- # set +x 00:05:41.430 [2024-07-14 02:53:36.568941] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:05:41.430 [2024-07-14 02:53:36.569034] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645437 ] 00:05:41.430 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.430 [2024-07-14 02:53:36.635749] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:41.430 [2024-07-14 02:53:36.673500] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:41.430 [2024-07-14 02:53:36.673638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:41.430 [2024-07-14 02:53:36.673641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.364 02:53:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:42.364 02:53:37 -- common/autotest_common.sh@852 -- # return 0 00:05:42.364 02:53:37 -- spdkcli/tcp.sh@31 -- # socat_pid=645503 00:05:42.364 02:53:37 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:42.364 02:53:37 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:42.364 [ 00:05:42.364 "spdk_get_version", 00:05:42.364 "rpc_get_methods", 00:05:42.364 "trace_get_info", 00:05:42.364 "trace_get_tpoint_group_mask", 00:05:42.364 "trace_disable_tpoint_group", 00:05:42.364 "trace_enable_tpoint_group", 00:05:42.364 "trace_clear_tpoint_mask", 00:05:42.364 "trace_set_tpoint_mask", 00:05:42.364 "vfu_tgt_set_base_path", 00:05:42.364 "framework_get_pci_devices", 00:05:42.364 "framework_get_config", 00:05:42.364 "framework_get_subsystems", 00:05:42.364 "iobuf_get_stats", 00:05:42.364 "iobuf_set_options", 00:05:42.364 "sock_set_default_impl", 00:05:42.364 "sock_impl_set_options", 00:05:42.364 "sock_impl_get_options", 00:05:42.364 "vmd_rescan", 00:05:42.364 "vmd_remove_device", 00:05:42.364 "vmd_enable", 00:05:42.364 "accel_get_stats", 00:05:42.364 "accel_set_options", 00:05:42.364 "accel_set_driver", 00:05:42.364 "accel_crypto_key_destroy", 00:05:42.364 "accel_crypto_keys_get", 00:05:42.364 "accel_crypto_key_create", 00:05:42.364 "accel_assign_opc", 00:05:42.364 "accel_get_module_info", 00:05:42.364 "accel_get_opc_assignments", 00:05:42.364 "notify_get_notifications", 00:05:42.364 "notify_get_types", 00:05:42.364 "bdev_get_histogram", 00:05:42.364 "bdev_enable_histogram", 00:05:42.364 "bdev_set_qos_limit", 00:05:42.364 "bdev_set_qd_sampling_period", 00:05:42.364 "bdev_get_bdevs", 00:05:42.364 "bdev_reset_iostat", 00:05:42.364 "bdev_get_iostat", 00:05:42.364 "bdev_examine", 00:05:42.364 "bdev_wait_for_examine", 00:05:42.364 "bdev_set_options", 00:05:42.364 "scsi_get_devices", 00:05:42.364 "thread_set_cpumask", 00:05:42.364 "framework_get_scheduler", 00:05:42.364 "framework_set_scheduler", 00:05:42.364 "framework_get_reactors", 00:05:42.364 "thread_get_io_channels", 00:05:42.364 "thread_get_pollers", 00:05:42.364 "thread_get_stats", 00:05:42.364 "framework_monitor_context_switch", 00:05:42.364 "spdk_kill_instance", 00:05:42.364 "log_enable_timestamps", 00:05:42.364 "log_get_flags", 00:05:42.364 "log_clear_flag", 00:05:42.364 "log_set_flag", 00:05:42.364 "log_get_level", 00:05:42.364 "log_set_level", 00:05:42.364 "log_get_print_level", 00:05:42.364 "log_set_print_level", 00:05:42.364 "framework_enable_cpumask_locks", 00:05:42.364 "framework_disable_cpumask_locks", 00:05:42.364 "framework_wait_init", 00:05:42.364 
"framework_start_init", 00:05:42.364 "virtio_blk_create_transport", 00:05:42.364 "virtio_blk_get_transports", 00:05:42.364 "vhost_controller_set_coalescing", 00:05:42.364 "vhost_get_controllers", 00:05:42.364 "vhost_delete_controller", 00:05:42.364 "vhost_create_blk_controller", 00:05:42.364 "vhost_scsi_controller_remove_target", 00:05:42.364 "vhost_scsi_controller_add_target", 00:05:42.364 "vhost_start_scsi_controller", 00:05:42.364 "vhost_create_scsi_controller", 00:05:42.365 "ublk_recover_disk", 00:05:42.365 "ublk_get_disks", 00:05:42.365 "ublk_stop_disk", 00:05:42.365 "ublk_start_disk", 00:05:42.365 "ublk_destroy_target", 00:05:42.365 "ublk_create_target", 00:05:42.365 "nbd_get_disks", 00:05:42.365 "nbd_stop_disk", 00:05:42.365 "nbd_start_disk", 00:05:42.365 "env_dpdk_get_mem_stats", 00:05:42.365 "nvmf_subsystem_get_listeners", 00:05:42.365 "nvmf_subsystem_get_qpairs", 00:05:42.365 "nvmf_subsystem_get_controllers", 00:05:42.365 "nvmf_get_stats", 00:05:42.365 "nvmf_get_transports", 00:05:42.365 "nvmf_create_transport", 00:05:42.365 "nvmf_get_targets", 00:05:42.365 "nvmf_delete_target", 00:05:42.365 "nvmf_create_target", 00:05:42.365 "nvmf_subsystem_allow_any_host", 00:05:42.365 "nvmf_subsystem_remove_host", 00:05:42.365 "nvmf_subsystem_add_host", 00:05:42.365 "nvmf_subsystem_remove_ns", 00:05:42.365 "nvmf_subsystem_add_ns", 00:05:42.365 "nvmf_subsystem_listener_set_ana_state", 00:05:42.365 "nvmf_discovery_get_referrals", 00:05:42.365 "nvmf_discovery_remove_referral", 00:05:42.365 "nvmf_discovery_add_referral", 00:05:42.365 "nvmf_subsystem_remove_listener", 00:05:42.365 "nvmf_subsystem_add_listener", 00:05:42.365 "nvmf_delete_subsystem", 00:05:42.365 "nvmf_create_subsystem", 00:05:42.365 "nvmf_get_subsystems", 00:05:42.365 "nvmf_set_crdt", 00:05:42.365 "nvmf_set_config", 00:05:42.365 "nvmf_set_max_subsystems", 00:05:42.365 "iscsi_set_options", 00:05:42.365 "iscsi_get_auth_groups", 00:05:42.365 "iscsi_auth_group_remove_secret", 00:05:42.365 "iscsi_auth_group_add_secret", 00:05:42.365 "iscsi_delete_auth_group", 00:05:42.365 "iscsi_create_auth_group", 00:05:42.365 "iscsi_set_discovery_auth", 00:05:42.365 "iscsi_get_options", 00:05:42.365 "iscsi_target_node_request_logout", 00:05:42.365 "iscsi_target_node_set_redirect", 00:05:42.365 "iscsi_target_node_set_auth", 00:05:42.365 "iscsi_target_node_add_lun", 00:05:42.365 "iscsi_get_connections", 00:05:42.365 "iscsi_portal_group_set_auth", 00:05:42.365 "iscsi_start_portal_group", 00:05:42.365 "iscsi_delete_portal_group", 00:05:42.365 "iscsi_create_portal_group", 00:05:42.365 "iscsi_get_portal_groups", 00:05:42.365 "iscsi_delete_target_node", 00:05:42.365 "iscsi_target_node_remove_pg_ig_maps", 00:05:42.365 "iscsi_target_node_add_pg_ig_maps", 00:05:42.365 "iscsi_create_target_node", 00:05:42.365 "iscsi_get_target_nodes", 00:05:42.365 "iscsi_delete_initiator_group", 00:05:42.365 "iscsi_initiator_group_remove_initiators", 00:05:42.365 "iscsi_initiator_group_add_initiators", 00:05:42.365 "iscsi_create_initiator_group", 00:05:42.365 "iscsi_get_initiator_groups", 00:05:42.365 "vfu_virtio_create_scsi_endpoint", 00:05:42.365 "vfu_virtio_scsi_remove_target", 00:05:42.365 "vfu_virtio_scsi_add_target", 00:05:42.365 "vfu_virtio_create_blk_endpoint", 00:05:42.365 "vfu_virtio_delete_endpoint", 00:05:42.365 "iaa_scan_accel_module", 00:05:42.365 "dsa_scan_accel_module", 00:05:42.365 "ioat_scan_accel_module", 00:05:42.365 "accel_error_inject_error", 00:05:42.365 "bdev_iscsi_delete", 00:05:42.365 "bdev_iscsi_create", 00:05:42.365 "bdev_iscsi_set_options", 
00:05:42.365 "bdev_virtio_attach_controller", 00:05:42.365 "bdev_virtio_scsi_get_devices", 00:05:42.365 "bdev_virtio_detach_controller", 00:05:42.365 "bdev_virtio_blk_set_hotplug", 00:05:42.365 "bdev_ftl_set_property", 00:05:42.365 "bdev_ftl_get_properties", 00:05:42.365 "bdev_ftl_get_stats", 00:05:42.365 "bdev_ftl_unmap", 00:05:42.365 "bdev_ftl_unload", 00:05:42.365 "bdev_ftl_delete", 00:05:42.365 "bdev_ftl_load", 00:05:42.365 "bdev_ftl_create", 00:05:42.365 "bdev_aio_delete", 00:05:42.365 "bdev_aio_rescan", 00:05:42.365 "bdev_aio_create", 00:05:42.365 "blobfs_create", 00:05:42.365 "blobfs_detect", 00:05:42.365 "blobfs_set_cache_size", 00:05:42.365 "bdev_zone_block_delete", 00:05:42.365 "bdev_zone_block_create", 00:05:42.365 "bdev_delay_delete", 00:05:42.365 "bdev_delay_create", 00:05:42.365 "bdev_delay_update_latency", 00:05:42.365 "bdev_split_delete", 00:05:42.365 "bdev_split_create", 00:05:42.365 "bdev_error_inject_error", 00:05:42.365 "bdev_error_delete", 00:05:42.365 "bdev_error_create", 00:05:42.365 "bdev_raid_set_options", 00:05:42.365 "bdev_raid_remove_base_bdev", 00:05:42.365 "bdev_raid_add_base_bdev", 00:05:42.365 "bdev_raid_delete", 00:05:42.365 "bdev_raid_create", 00:05:42.365 "bdev_raid_get_bdevs", 00:05:42.365 "bdev_lvol_grow_lvstore", 00:05:42.365 "bdev_lvol_get_lvols", 00:05:42.365 "bdev_lvol_get_lvstores", 00:05:42.365 "bdev_lvol_delete", 00:05:42.365 "bdev_lvol_set_read_only", 00:05:42.365 "bdev_lvol_resize", 00:05:42.365 "bdev_lvol_decouple_parent", 00:05:42.365 "bdev_lvol_inflate", 00:05:42.365 "bdev_lvol_rename", 00:05:42.365 "bdev_lvol_clone_bdev", 00:05:42.365 "bdev_lvol_clone", 00:05:42.365 "bdev_lvol_snapshot", 00:05:42.365 "bdev_lvol_create", 00:05:42.365 "bdev_lvol_delete_lvstore", 00:05:42.365 "bdev_lvol_rename_lvstore", 00:05:42.365 "bdev_lvol_create_lvstore", 00:05:42.365 "bdev_passthru_delete", 00:05:42.365 "bdev_passthru_create", 00:05:42.365 "bdev_nvme_cuse_unregister", 00:05:42.365 "bdev_nvme_cuse_register", 00:05:42.365 "bdev_opal_new_user", 00:05:42.365 "bdev_opal_set_lock_state", 00:05:42.365 "bdev_opal_delete", 00:05:42.365 "bdev_opal_get_info", 00:05:42.365 "bdev_opal_create", 00:05:42.365 "bdev_nvme_opal_revert", 00:05:42.365 "bdev_nvme_opal_init", 00:05:42.365 "bdev_nvme_send_cmd", 00:05:42.365 "bdev_nvme_get_path_iostat", 00:05:42.365 "bdev_nvme_get_mdns_discovery_info", 00:05:42.365 "bdev_nvme_stop_mdns_discovery", 00:05:42.365 "bdev_nvme_start_mdns_discovery", 00:05:42.365 "bdev_nvme_set_multipath_policy", 00:05:42.365 "bdev_nvme_set_preferred_path", 00:05:42.365 "bdev_nvme_get_io_paths", 00:05:42.365 "bdev_nvme_remove_error_injection", 00:05:42.365 "bdev_nvme_add_error_injection", 00:05:42.365 "bdev_nvme_get_discovery_info", 00:05:42.365 "bdev_nvme_stop_discovery", 00:05:42.365 "bdev_nvme_start_discovery", 00:05:42.365 "bdev_nvme_get_controller_health_info", 00:05:42.365 "bdev_nvme_disable_controller", 00:05:42.365 "bdev_nvme_enable_controller", 00:05:42.365 "bdev_nvme_reset_controller", 00:05:42.365 "bdev_nvme_get_transport_statistics", 00:05:42.365 "bdev_nvme_apply_firmware", 00:05:42.365 "bdev_nvme_detach_controller", 00:05:42.365 "bdev_nvme_get_controllers", 00:05:42.365 "bdev_nvme_attach_controller", 00:05:42.365 "bdev_nvme_set_hotplug", 00:05:42.365 "bdev_nvme_set_options", 00:05:42.365 "bdev_null_resize", 00:05:42.365 "bdev_null_delete", 00:05:42.365 "bdev_null_create", 00:05:42.365 "bdev_malloc_delete", 00:05:42.365 "bdev_malloc_create" 00:05:42.365 ] 00:05:42.365 02:53:37 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:42.365 02:53:37 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:42.365 02:53:37 -- common/autotest_common.sh@10 -- # set +x 00:05:42.365 02:53:37 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:42.365 02:53:37 -- spdkcli/tcp.sh@38 -- # killprocess 645437 00:05:42.365 02:53:37 -- common/autotest_common.sh@926 -- # '[' -z 645437 ']' 00:05:42.365 02:53:37 -- common/autotest_common.sh@930 -- # kill -0 645437 00:05:42.365 02:53:37 -- common/autotest_common.sh@931 -- # uname 00:05:42.365 02:53:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:42.365 02:53:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 645437 00:05:42.623 02:53:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:42.623 02:53:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:42.623 02:53:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 645437' 00:05:42.623 killing process with pid 645437 00:05:42.623 02:53:37 -- common/autotest_common.sh@945 -- # kill 645437 00:05:42.623 02:53:37 -- common/autotest_common.sh@950 -- # wait 645437 00:05:42.881 00:05:42.881 real 0m1.493s 00:05:42.881 user 0m2.798s 00:05:42.881 sys 0m0.485s 00:05:42.881 02:53:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.881 02:53:37 -- common/autotest_common.sh@10 -- # set +x 00:05:42.881 ************************************ 00:05:42.881 END TEST spdkcli_tcp 00:05:42.881 ************************************ 00:05:42.881 02:53:37 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:42.881 02:53:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.881 02:53:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.881 02:53:37 -- common/autotest_common.sh@10 -- # set +x 00:05:42.881 ************************************ 00:05:42.881 START TEST dpdk_mem_utility 00:05:42.881 ************************************ 00:05:42.881 02:53:37 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:42.881 * Looking for test storage... 00:05:42.881 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:42.881 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:42.881 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=645812 00:05:42.881 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 645812 00:05:42.881 02:53:38 -- common/autotest_common.sh@819 -- # '[' -z 645812 ']' 00:05:42.881 02:53:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.881 02:53:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:42.881 02:53:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.881 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:42.882 02:53:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:42.882 02:53:38 -- common/autotest_common.sh@10 -- # set +x 00:05:42.882 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.882 [2024-07-14 02:53:38.085855] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:42.882 [2024-07-14 02:53:38.085945] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645812 ] 00:05:42.882 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.139 [2024-07-14 02:53:38.153730] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.139 [2024-07-14 02:53:38.190978] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:43.139 [2024-07-14 02:53:38.191088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.706 02:53:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:43.706 02:53:38 -- common/autotest_common.sh@852 -- # return 0 00:05:43.706 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:43.706 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:43.706 02:53:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.706 02:53:38 -- common/autotest_common.sh@10 -- # set +x 00:05:43.706 { 00:05:43.706 "filename": "/tmp/spdk_mem_dump.txt" 00:05:43.706 } 00:05:43.706 02:53:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.706 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:43.706 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:43.706 1 heaps totaling size 814.000000 MiB 00:05:43.706 size: 814.000000 MiB heap id: 0 00:05:43.706 end heaps---------- 00:05:43.706 8 mempools totaling size 598.116089 MiB 00:05:43.706 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:43.706 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:43.706 size: 84.521057 MiB name: bdev_io_645812 00:05:43.706 size: 51.011292 MiB name: evtpool_645812 00:05:43.706 size: 50.003479 MiB name: msgpool_645812 00:05:43.706 size: 21.763794 MiB name: PDU_Pool 00:05:43.706 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:43.706 size: 0.026123 MiB name: Session_Pool 00:05:43.706 end mempools------- 00:05:43.706 6 memzones totaling size 4.142822 MiB 00:05:43.706 size: 1.000366 MiB name: RG_ring_0_645812 00:05:43.706 size: 1.000366 MiB name: RG_ring_1_645812 00:05:43.706 size: 1.000366 MiB name: RG_ring_4_645812 00:05:43.706 size: 1.000366 MiB name: RG_ring_5_645812 00:05:43.706 size: 0.125366 MiB name: RG_ring_2_645812 00:05:43.706 size: 0.015991 MiB name: RG_ring_3_645812 00:05:43.706 end memzones------- 00:05:43.706 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:43.965 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:43.965 list of free elements. 
size: 12.519348 MiB 00:05:43.965 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:43.965 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:43.965 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:43.965 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:43.965 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:43.965 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:43.965 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:43.965 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:43.965 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:43.965 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:43.965 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:43.965 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:43.965 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:43.965 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:43.965 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:43.965 list of standard malloc elements. size: 199.218079 MiB 00:05:43.965 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:43.965 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:43.965 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:43.965 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:43.965 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:43.965 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:43.965 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:43.965 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:43.965 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:43.965 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:43.965 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:43.965 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:43.965 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:43.965 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:43.965 list of memzone associated elements. size: 602.262573 MiB 00:05:43.965 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:43.965 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:43.965 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:43.965 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:43.965 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:43.965 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_645812_0 00:05:43.965 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:43.965 associated memzone info: size: 48.002930 MiB name: MP_evtpool_645812_0 00:05:43.965 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:43.965 associated memzone info: size: 48.002930 MiB name: MP_msgpool_645812_0 00:05:43.965 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:43.965 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:43.965 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:43.965 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:43.965 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:43.965 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_645812 00:05:43.965 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:43.965 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_645812 00:05:43.965 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:43.965 associated memzone info: size: 1.007996 MiB name: MP_evtpool_645812 00:05:43.965 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:43.965 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:43.965 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:43.965 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:43.965 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:43.965 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:43.965 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:43.965 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:43.965 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:43.965 associated memzone info: size: 1.000366 MiB name: RG_ring_0_645812 00:05:43.965 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:43.965 associated memzone info: size: 1.000366 MiB name: RG_ring_1_645812 00:05:43.965 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:43.965 associated memzone info: size: 1.000366 MiB name: RG_ring_4_645812 00:05:43.965 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:43.965 associated memzone info: size: 1.000366 MiB name: RG_ring_5_645812 00:05:43.965 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:43.965 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_645812 00:05:43.965 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:43.965 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:43.965 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:43.965 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:43.965 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:43.965 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:43.965 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:43.965 associated memzone info: size: 0.125366 MiB name: RG_ring_2_645812 00:05:43.965 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:43.965 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:43.965 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:43.965 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:43.965 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:43.965 associated memzone info: size: 0.015991 MiB name: RG_ring_3_645812 00:05:43.965 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:43.965 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:43.965 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:43.965 associated memzone info: size: 0.000183 MiB name: MP_msgpool_645812 00:05:43.965 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:43.966 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_645812 00:05:43.966 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:43.966 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:43.966 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:43.966 02:53:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 645812 00:05:43.966 02:53:38 -- common/autotest_common.sh@926 -- # '[' -z 645812 ']' 00:05:43.966 02:53:38 -- common/autotest_common.sh@930 -- # kill -0 645812 00:05:43.966 02:53:38 -- common/autotest_common.sh@931 -- # uname 00:05:43.966 02:53:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:43.966 02:53:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 645812 00:05:43.966 02:53:39 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:43.966 02:53:39 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:43.966 02:53:39 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 645812' 00:05:43.966 killing process with pid 645812 00:05:43.966 02:53:39 -- common/autotest_common.sh@945 -- # kill 645812 00:05:43.966 02:53:39 -- common/autotest_common.sh@950 -- # wait 645812 00:05:44.225 00:05:44.225 real 0m1.337s 00:05:44.225 user 0m1.380s 00:05:44.225 sys 0m0.405s 00:05:44.225 02:53:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:44.225 02:53:39 -- common/autotest_common.sh@10 -- # set +x 00:05:44.225 ************************************ 00:05:44.225 END TEST dpdk_mem_utility 00:05:44.225 ************************************ 00:05:44.225 02:53:39 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:44.225 02:53:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:44.225 02:53:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:44.225 02:53:39 -- common/autotest_common.sh@10 -- # set +x 00:05:44.225 
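For reference, the heap/mempool/memzone dump above comes from a two-step flow: the env_dpdk_get_mem_stats RPC writes a raw dump file, and dpdk_mem_info.py renders it. A minimal sketch against an already-running target:

  ./scripts/rpc.py env_dpdk_get_mem_stats   # writes /tmp/spdk_mem_dump.txt
  ./scripts/dpdk_mem_info.py                # summary of heaps, mempools, memzones
  ./scripts/dpdk_mem_info.py -m 0           # per-element detail for heap id 0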
************************************ 00:05:44.225 START TEST event 00:05:44.225 ************************************ 00:05:44.225 02:53:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:44.225 * Looking for test storage... 00:05:44.225 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:44.225 02:53:39 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:44.225 02:53:39 -- bdev/nbd_common.sh@6 -- # set -e 00:05:44.225 02:53:39 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:44.225 02:53:39 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:44.225 02:53:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:44.225 02:53:39 -- common/autotest_common.sh@10 -- # set +x 00:05:44.225 ************************************ 00:05:44.225 START TEST event_perf 00:05:44.225 ************************************ 00:05:44.225 02:53:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:44.484 Running I/O for 1 seconds...[2024-07-14 02:53:39.479880] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:44.484 [2024-07-14 02:53:39.479953] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646084 ] 00:05:44.484 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.484 [2024-07-14 02:53:39.543833] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:44.484 [2024-07-14 02:53:39.583170] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:44.484 [2024-07-14 02:53:39.583268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:44.484 [2024-07-14 02:53:39.583355] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:44.484 [2024-07-14 02:53:39.583357] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.421 Running I/O for 1 seconds... 00:05:45.421 lcore 0: 199706 00:05:45.421 lcore 1: 199706 00:05:45.421 lcore 2: 199706 00:05:45.421 lcore 3: 199705 00:05:45.421 done. 
00:05:45.421 00:05:45.421 real 0m1.173s 00:05:45.421 user 0m4.090s 00:05:45.422 sys 0m0.082s 00:05:45.422 02:53:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.422 02:53:40 -- common/autotest_common.sh@10 -- # set +x 00:05:45.422 ************************************ 00:05:45.422 END TEST event_perf 00:05:45.422 ************************************ 00:05:45.681 02:53:40 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:45.681 02:53:40 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:45.681 02:53:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.681 02:53:40 -- common/autotest_common.sh@10 -- # set +x 00:05:45.681 ************************************ 00:05:45.681 START TEST event_reactor 00:05:45.681 ************************************ 00:05:45.681 02:53:40 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:45.682 [2024-07-14 02:53:40.704863] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:45.682 [2024-07-14 02:53:40.704997] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646218 ] 00:05:45.682 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.682 [2024-07-14 02:53:40.774469] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.682 [2024-07-14 02:53:40.809272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.619 test_start 00:05:46.619 oneshot 00:05:46.619 tick 100 00:05:46.619 tick 100 00:05:46.619 tick 250 00:05:46.619 tick 100 00:05:46.619 tick 100 00:05:46.619 tick 250 00:05:46.619 tick 500 00:05:46.619 tick 100 00:05:46.619 tick 100 00:05:46.619 tick 100 00:05:46.619 tick 250 00:05:46.619 tick 100 00:05:46.619 tick 100 00:05:46.619 test_end 00:05:46.619 00:05:46.619 real 0m1.176s 00:05:46.619 user 0m1.080s 00:05:46.619 sys 0m0.092s 00:05:46.619 02:53:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:46.619 02:53:41 -- common/autotest_common.sh@10 -- # set +x 00:05:46.619 ************************************ 00:05:46.619 END TEST event_reactor 00:05:46.619 ************************************ 00:05:46.879 02:53:41 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:46.879 02:53:41 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:46.879 02:53:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:46.879 02:53:41 -- common/autotest_common.sh@10 -- # set +x 00:05:46.879 ************************************ 00:05:46.879 START TEST event_reactor_perf 00:05:46.879 ************************************ 00:05:46.879 02:53:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:46.879 [2024-07-14 02:53:41.920244] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:05:46.879 [2024-07-14 02:53:41.920331] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646481 ] 00:05:46.879 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.879 [2024-07-14 02:53:41.987910] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.879 [2024-07-14 02:53:42.022699] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.260 test_start 00:05:48.260 test_end 00:05:48.261 Performance: 954136 events per second 00:05:48.261 00:05:48.261 real 0m1.173s 00:05:48.261 user 0m1.087s 00:05:48.261 sys 0m0.082s 00:05:48.261 02:53:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.261 ************************************ 00:05:48.261 END TEST event_reactor_perf 00:05:48.261 ************************************ 00:05:48.261 02:53:43 -- event/event.sh@49 -- # uname -s 00:05:48.261 02:53:43 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:48.261 02:53:43 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:48.261 02:53:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.261 02:53:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.261 ************************************ 00:05:48.261 START TEST event_scheduler 00:05:48.261 ************************************ 00:05:48.261 02:53:43 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:48.261 * Looking for test storage... 00:05:48.261 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@35 -- # scheduler_pid=646796 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@37 -- # waitforlisten 646796 00:05:48.261 02:53:43 -- common/autotest_common.sh@819 -- # '[' -z 646796 ']' 00:05:48.261 02:53:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.261 02:53:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:48.261 02:53:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.261 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.261 02:53:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.261 [2024-07-14 02:53:43.242972] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
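reactor_perf prints a single throughput figure ("Performance: N events per second") after its one-second run on core 0. A small sketch for pulling that number out, assuming the same invocation as the trace and $SPDK set as above:

  out=$("$SPDK/test/event/reactor_perf/reactor_perf" -t 1)
  echo "$out" | awk '/Performance:/ {print $2}'    # -> 954136 in this run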
00:05:48.261 [2024-07-14 02:53:43.243067] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646796 ] 00:05:48.261 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.261 [2024-07-14 02:53:43.308060] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:48.261 [2024-07-14 02:53:43.347519] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.261 [2024-07-14 02:53:43.347604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:48.261 [2024-07-14 02:53:43.347688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:48.261 [2024-07-14 02:53:43.347690] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:48.261 02:53:43 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:48.261 02:53:43 -- common/autotest_common.sh@852 -- # return 0 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:48.261 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.261 POWER: Env isn't set yet! 00:05:48.261 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:48.261 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:48.261 POWER: Cannot set governor of lcore 0 to userspace 00:05:48.261 POWER: Attempting to initialise PSTAT power management... 00:05:48.261 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:48.261 POWER: Initialized successfully for lcore 0 power management 00:05:48.261 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:48.261 POWER: Initialized successfully for lcore 1 power management 00:05:48.261 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:48.261 POWER: Initialized successfully for lcore 2 power management 00:05:48.261 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:48.261 POWER: Initialized successfully for lcore 3 power management 00:05:48.261 [2024-07-14 02:53:43.438612] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:48.261 [2024-07-14 02:53:43.438628] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:48.261 [2024-07-14 02:53:43.438639] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:48.261 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:48.261 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.261 [2024-07-14 02:53:43.500815] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
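The scheduler app is launched with --wait-for-rpc, so the test first switches the framework to the dynamic scheduler over RPC and only then kicks off initialization, at which point the reactors start and each lcore's cpufreq governor is flipped to 'performance'. The handshake, sketched with the same rpc.py used throughout this log:

  RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  "$RPC" framework_set_scheduler dynamic   # only legal before framework init
  "$RPC" framework_start_init              # reactors start; governors move to 'performance'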
00:05:48.261 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:48.261 02:53:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.261 02:53:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.261 ************************************ 00:05:48.261 START TEST scheduler_create_thread 00:05:48.261 ************************************ 00:05:48.261 02:53:43 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:48.261 02:53:43 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:48.261 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.261 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.519 2 00:05:48.519 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.519 02:53:43 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:48.519 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.519 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.519 3 00:05:48.519 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.519 02:53:43 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:48.519 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.519 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.519 4 00:05:48.519 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.519 02:53:43 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:48.519 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.519 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.519 5 00:05:48.519 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.519 02:53:43 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:48.519 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.519 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.519 6 00:05:48.520 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:48.520 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.520 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.520 7 00:05:48.520 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:48.520 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.520 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.520 8 00:05:48.520 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:48.520 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.520 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.520 9 00:05:48.520 
02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:48.520 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.520 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.520 10 00:05:48.520 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:48.520 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.520 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:48.520 02:53:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:48.520 02:53:43 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:48.520 02:53:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.520 02:53:43 -- common/autotest_common.sh@10 -- # set +x 00:05:49.456 02:53:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.456 02:53:44 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:49.456 02:53:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.456 02:53:44 -- common/autotest_common.sh@10 -- # set +x 00:05:50.835 02:53:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.835 02:53:45 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:50.835 02:53:45 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:50.835 02:53:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.835 02:53:45 -- common/autotest_common.sh@10 -- # set +x 00:05:51.774 02:53:46 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:51.774 00:05:51.774 real 0m3.382s 00:05:51.774 user 0m0.021s 00:05:51.774 sys 0m0.008s 00:05:51.774 02:53:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.774 02:53:46 -- common/autotest_common.sh@10 -- # set +x 00:05:51.774 ************************************ 00:05:51.774 END TEST scheduler_create_thread 00:05:51.774 ************************************ 00:05:51.774 02:53:46 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:51.774 02:53:46 -- scheduler/scheduler.sh@46 -- # killprocess 646796 00:05:51.774 02:53:46 -- common/autotest_common.sh@926 -- # '[' -z 646796 ']' 00:05:51.774 02:53:46 -- common/autotest_common.sh@930 -- # kill -0 646796 00:05:51.774 02:53:46 -- common/autotest_common.sh@931 -- # uname 00:05:51.774 02:53:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:51.774 02:53:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 646796 00:05:51.774 02:53:46 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:51.774 02:53:46 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:51.774 02:53:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 646796' 00:05:51.774 killing process with pid 646796 00:05:51.774 02:53:46 -- common/autotest_common.sh@945 -- # kill 646796 00:05:51.774 02:53:46 -- common/autotest_common.sh@950 -- # wait 646796 00:05:52.034 [2024-07-14 02:53:47.272497] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
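scheduler_create_thread exercises the plugin RPCs traced above: it creates pinned active and idle threads, dials one thread down to 50% activity, then creates and deletes a throwaway thread. Condensed, with thread ids captured the way the trace does ($RPC as above; the extra commands come from the test's scheduler_plugin, loaded via --plugin):

  P="--plugin scheduler_plugin"
  "$RPC" $P scheduler_thread_create -n active_pinned -m 0x1 -a 100
  id=$("$RPC" $P scheduler_thread_create -n half_active -a 0)
  "$RPC" $P scheduler_thread_set_active "$id" 50
  victim=$("$RPC" $P scheduler_thread_create -n deleted -a 100)
  "$RPC" $P scheduler_thread_delete "$victim"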
00:05:52.294 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:52.294 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:52.294 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:52.294 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:52.294 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:52.294 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:52.294 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:52.294 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:52.294 00:05:52.294 real 0m4.364s 00:05:52.294 user 0m7.718s 00:05:52.294 sys 0m0.343s 00:05:52.294 02:53:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.294 02:53:47 -- common/autotest_common.sh@10 -- # set +x 00:05:52.294 ************************************ 00:05:52.294 END TEST event_scheduler 00:05:52.294 ************************************ 00:05:52.294 02:53:47 -- event/event.sh@51 -- # modprobe -n nbd 00:05:52.294 02:53:47 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:52.294 02:53:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:52.294 02:53:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:52.294 02:53:47 -- common/autotest_common.sh@10 -- # set +x 00:05:52.294 ************************************ 00:05:52.294 START TEST app_repeat 00:05:52.294 ************************************ 00:05:52.294 02:53:47 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:52.294 02:53:47 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:52.294 02:53:47 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:52.294 02:53:47 -- event/event.sh@13 -- # local nbd_list 00:05:52.294 02:53:47 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:52.294 02:53:47 -- event/event.sh@14 -- # local bdev_list 00:05:52.294 02:53:47 -- event/event.sh@15 -- # local repeat_times=4 00:05:52.294 02:53:47 -- event/event.sh@17 -- # modprobe nbd 00:05:52.294 02:53:47 -- event/event.sh@19 -- # repeat_pid=647652 00:05:52.294 02:53:47 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:52.294 02:53:47 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:52.555 02:53:47 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 647652' 00:05:52.555 Process app_repeat pid: 647652 00:05:52.555 02:53:47 -- event/event.sh@23 -- # for i in {0..2} 00:05:52.555 02:53:47 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:52.555 spdk_app_start Round 0 00:05:52.555 02:53:47 -- event/event.sh@25 -- # waitforlisten 647652 /var/tmp/spdk-nbd.sock 00:05:52.555 02:53:47 -- common/autotest_common.sh@819 -- # '[' -z 647652 ']' 00:05:52.555 02:53:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:52.555 02:53:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:52.555 02:53:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:52.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:52.555 02:53:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:52.555 02:53:47 -- common/autotest_common.sh@10 -- # set +x 00:05:52.555 [2024-07-14 02:53:47.563124] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:52.555 [2024-07-14 02:53:47.563216] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647652 ] 00:05:52.555 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.555 [2024-07-14 02:53:47.630999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:52.555 [2024-07-14 02:53:47.666255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.555 [2024-07-14 02:53:47.666258] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.124 02:53:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:53.124 02:53:48 -- common/autotest_common.sh@852 -- # return 0 00:05:53.124 02:53:48 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:53.383 Malloc0 00:05:53.383 02:53:48 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:53.642 Malloc1 00:05:53.642 02:53:48 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:53.642 02:53:48 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:53.643 02:53:48 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.643 02:53:48 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:53.643 02:53:48 -- bdev/nbd_common.sh@12 -- # local i 00:05:53.643 02:53:48 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:53.643 02:53:48 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:53.643 02:53:48 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:53.902 /dev/nbd0 00:05:53.902 02:53:48 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:53.902 02:53:48 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:53.902 02:53:48 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:53.902 02:53:48 -- common/autotest_common.sh@857 -- # local i 00:05:53.902 02:53:48 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:53.902 02:53:48 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:53.902 02:53:48 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:53.902 02:53:48 -- 
common/autotest_common.sh@861 -- # break 00:05:53.902 02:53:48 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:53.902 02:53:48 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:53.902 02:53:48 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:53.902 1+0 records in 00:05:53.902 1+0 records out 00:05:53.902 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267551 s, 15.3 MB/s 00:05:53.902 02:53:48 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:53.902 02:53:48 -- common/autotest_common.sh@874 -- # size=4096 00:05:53.902 02:53:48 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:53.902 02:53:48 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:53.902 02:53:48 -- common/autotest_common.sh@877 -- # return 0 00:05:53.902 02:53:48 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:53.902 02:53:48 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:53.903 02:53:48 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:53.903 /dev/nbd1 00:05:53.903 02:53:49 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:53.903 02:53:49 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:53.903 02:53:49 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:53.903 02:53:49 -- common/autotest_common.sh@857 -- # local i 00:05:53.903 02:53:49 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:53.903 02:53:49 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:53.903 02:53:49 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:53.903 02:53:49 -- common/autotest_common.sh@861 -- # break 00:05:53.903 02:53:49 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:53.903 02:53:49 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:53.903 02:53:49 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:53.903 1+0 records in 00:05:53.903 1+0 records out 00:05:53.903 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249229 s, 16.4 MB/s 00:05:53.903 02:53:49 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:54.162 02:53:49 -- common/autotest_common.sh@874 -- # size=4096 00:05:54.162 02:53:49 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:54.162 02:53:49 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:54.162 02:53:49 -- common/autotest_common.sh@877 -- # return 0 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:54.162 { 00:05:54.162 "nbd_device": "/dev/nbd0", 00:05:54.162 "bdev_name": "Malloc0" 00:05:54.162 }, 00:05:54.162 { 00:05:54.162 "nbd_device": 
"/dev/nbd1", 00:05:54.162 "bdev_name": "Malloc1" 00:05:54.162 } 00:05:54.162 ]' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:54.162 { 00:05:54.162 "nbd_device": "/dev/nbd0", 00:05:54.162 "bdev_name": "Malloc0" 00:05:54.162 }, 00:05:54.162 { 00:05:54.162 "nbd_device": "/dev/nbd1", 00:05:54.162 "bdev_name": "Malloc1" 00:05:54.162 } 00:05:54.162 ]' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:54.162 /dev/nbd1' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:54.162 /dev/nbd1' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@65 -- # count=2 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@95 -- # count=2 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:54.162 256+0 records in 00:05:54.162 256+0 records out 00:05:54.162 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112109 s, 93.5 MB/s 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:54.162 02:53:49 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:54.422 256+0 records in 00:05:54.422 256+0 records out 00:05:54.422 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0204244 s, 51.3 MB/s 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:54.422 256+0 records in 00:05:54.422 256+0 records out 00:05:54.422 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217456 s, 48.2 MB/s 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:54.422 02:53:49 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@51 -- # local i 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@41 -- # break 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@45 -- # return 0 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:54.422 02:53:49 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:54.680 02:53:49 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:54.680 02:53:49 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@41 -- # break 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@45 -- # return 0 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:54.681 02:53:49 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@65 -- # true 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@65 -- # count=0 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@104 -- # count=0 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:54.940 02:53:50 -- bdev/nbd_common.sh@109 -- # return 0 00:05:54.940 02:53:50 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
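Round 0's data path boils down to: fill a scratch file with random data, stream it onto each exported nbd device with O_DIRECT, then byte-compare the device back against the file. The cycle, reduced to its essentials with paths exactly as in the trace:

  rand="$SPDK/test/event/nbdrandtest"
  dd if=/dev/urandom of="$rand" bs=4096 count=256
  for nbd in /dev/nbd0 /dev/nbd1; do
    dd if="$rand" of="$nbd" bs=4096 count=256 oflag=direct
    cmp -b -n 1M "$rand" "$nbd"    # a non-zero exit here fails the test
  done
  rm "$rand"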
00:05:55.200 02:53:50 -- event/event.sh@35 -- # sleep 3 00:05:55.459 [2024-07-14 02:53:50.483522] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:55.459 [2024-07-14 02:53:50.515719] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:55.459 [2024-07-14 02:53:50.515721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.459 [2024-07-14 02:53:50.556006] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:55.459 [2024-07-14 02:53:50.556047] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:58.750 02:53:53 -- event/event.sh@23 -- # for i in {0..2} 00:05:58.750 02:53:53 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:58.750 spdk_app_start Round 1 00:05:58.750 02:53:53 -- event/event.sh@25 -- # waitforlisten 647652 /var/tmp/spdk-nbd.sock 00:05:58.750 02:53:53 -- common/autotest_common.sh@819 -- # '[' -z 647652 ']' 00:05:58.750 02:53:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:58.750 02:53:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:58.750 02:53:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:58.750 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:58.750 02:53:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:58.750 02:53:53 -- common/autotest_common.sh@10 -- # set +x 00:05:58.750 02:53:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:58.750 02:53:53 -- common/autotest_common.sh@852 -- # return 0 00:05:58.750 02:53:53 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:58.750 Malloc0 00:05:58.750 02:53:53 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:58.750 Malloc1 00:05:58.750 02:53:53 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@12 -- # local i 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:58.750 02:53:53 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:59.009 
/dev/nbd0 00:05:59.009 02:53:54 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:59.009 02:53:54 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:59.009 02:53:54 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:59.009 02:53:54 -- common/autotest_common.sh@857 -- # local i 00:05:59.009 02:53:54 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:59.009 02:53:54 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:59.009 02:53:54 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:59.009 02:53:54 -- common/autotest_common.sh@861 -- # break 00:05:59.009 02:53:54 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:59.009 02:53:54 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:59.009 02:53:54 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:59.009 1+0 records in 00:05:59.009 1+0 records out 00:05:59.009 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256695 s, 16.0 MB/s 00:05:59.009 02:53:54 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.009 02:53:54 -- common/autotest_common.sh@874 -- # size=4096 00:05:59.009 02:53:54 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.009 02:53:54 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:59.009 02:53:54 -- common/autotest_common.sh@877 -- # return 0 00:05:59.009 02:53:54 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:59.009 02:53:54 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.009 02:53:54 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:59.009 /dev/nbd1 00:05:59.009 02:53:54 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:59.268 02:53:54 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:59.268 02:53:54 -- common/autotest_common.sh@857 -- # local i 00:05:59.268 02:53:54 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:59.268 02:53:54 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:59.268 02:53:54 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:59.268 02:53:54 -- common/autotest_common.sh@861 -- # break 00:05:59.268 02:53:54 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:59.268 02:53:54 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:59.268 02:53:54 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:59.268 1+0 records in 00:05:59.268 1+0 records out 00:05:59.268 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265068 s, 15.5 MB/s 00:05:59.268 02:53:54 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.268 02:53:54 -- common/autotest_common.sh@874 -- # size=4096 00:05:59.268 02:53:54 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.268 02:53:54 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:59.268 02:53:54 -- common/autotest_common.sh@877 -- # return 0 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:59.268 { 00:05:59.268 "nbd_device": "/dev/nbd0", 00:05:59.268 "bdev_name": "Malloc0" 00:05:59.268 }, 00:05:59.268 { 00:05:59.268 "nbd_device": "/dev/nbd1", 00:05:59.268 "bdev_name": "Malloc1" 00:05:59.268 } 00:05:59.268 ]' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:59.268 { 00:05:59.268 "nbd_device": "/dev/nbd0", 00:05:59.268 "bdev_name": "Malloc0" 00:05:59.268 }, 00:05:59.268 { 00:05:59.268 "nbd_device": "/dev/nbd1", 00:05:59.268 "bdev_name": "Malloc1" 00:05:59.268 } 00:05:59.268 ]' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:59.268 /dev/nbd1' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:59.268 /dev/nbd1' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@65 -- # count=2 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@95 -- # count=2 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:59.268 256+0 records in 00:05:59.268 256+0 records out 00:05:59.268 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113639 s, 92.3 MB/s 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:59.268 02:53:54 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:59.527 256+0 records in 00:05:59.527 256+0 records out 00:05:59.527 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203181 s, 51.6 MB/s 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:59.527 256+0 records in 00:05:59.527 256+0 records out 00:05:59.527 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021783 s, 48.1 MB/s 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@51 -- # local i 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:59.527 02:53:54 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@41 -- # break 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@45 -- # return 0 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@41 -- # break 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@45 -- # return 0 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.787 02:53:54 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:00.046 02:53:55 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@65 -- # true 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@65 -- # count=0 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@104 -- # count=0 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:00.046 02:53:55 -- bdev/nbd_common.sh@109 -- # return 0 00:06:00.046 02:53:55 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:00.305 02:53:55 -- event/event.sh@35 -- # sleep 3 00:06:00.305 [2024-07-14 02:53:55.544548] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:00.564 [2024-07-14 02:53:55.577110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:00.564 [2024-07-14 02:53:55.577113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.564 [2024-07-14 02:53:55.617468] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:00.564 [2024-07-14 02:53:55.617510] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:03.850 02:53:58 -- event/event.sh@23 -- # for i in {0..2} 00:06:03.850 02:53:58 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:03.850 spdk_app_start Round 2 00:06:03.850 02:53:58 -- event/event.sh@25 -- # waitforlisten 647652 /var/tmp/spdk-nbd.sock 00:06:03.850 02:53:58 -- common/autotest_common.sh@819 -- # '[' -z 647652 ']' 00:06:03.850 02:53:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:03.850 02:53:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:03.850 02:53:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:03.850 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
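app_repeat runs the same start/verify/teardown sequence three times; each round ends with spdk_kill_instance SIGTERM followed by a fixed sleep so the app can come back up before the next waitforlisten. The outer shape of the loop, simplified from the 'for i in {0..2}' seen in the event.sh trace ($RPC as above):

  for i in {0..2}; do
    echo "spdk_app_start Round $i"
    # ... create Malloc0/Malloc1, export /dev/nbd0-1, write + verify ...
    "$RPC" -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
    sleep 3    # let the app restart before the next round
  done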
00:06:03.850 02:53:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:03.850 02:53:58 -- common/autotest_common.sh@10 -- # set +x 00:06:03.850 02:53:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:03.850 02:53:58 -- common/autotest_common.sh@852 -- # return 0 00:06:03.850 02:53:58 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:03.850 Malloc0 00:06:03.850 02:53:58 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:03.850 Malloc1 00:06:03.850 02:53:58 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@12 -- # local i 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.850 02:53:58 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:03.850 /dev/nbd0 00:06:03.850 02:53:59 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:03.850 02:53:59 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:03.850 02:53:59 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:03.850 02:53:59 -- common/autotest_common.sh@857 -- # local i 00:06:03.850 02:53:59 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:03.850 02:53:59 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:03.850 02:53:59 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:03.850 02:53:59 -- common/autotest_common.sh@861 -- # break 00:06:03.850 02:53:59 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:03.850 02:53:59 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:03.850 02:53:59 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:03.850 1+0 records in 00:06:03.850 1+0 records out 00:06:03.850 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257996 s, 15.9 MB/s 00:06:03.850 02:53:59 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.850 02:53:59 -- common/autotest_common.sh@874 -- # size=4096 00:06:03.850 02:53:59 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.850 02:53:59 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:03.850 02:53:59 -- common/autotest_common.sh@877 -- # return 0 00:06:03.850 02:53:59 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:03.850 02:53:59 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.850 02:53:59 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:04.109 /dev/nbd1 00:06:04.109 02:53:59 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:04.109 02:53:59 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:04.109 02:53:59 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:04.109 02:53:59 -- common/autotest_common.sh@857 -- # local i 00:06:04.109 02:53:59 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:04.109 02:53:59 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:04.109 02:53:59 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:04.109 02:53:59 -- common/autotest_common.sh@861 -- # break 00:06:04.109 02:53:59 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:04.110 02:53:59 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:04.110 02:53:59 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.110 1+0 records in 00:06:04.110 1+0 records out 00:06:04.110 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256062 s, 16.0 MB/s 00:06:04.110 02:53:59 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.110 02:53:59 -- common/autotest_common.sh@874 -- # size=4096 00:06:04.110 02:53:59 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.110 02:53:59 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:04.110 02:53:59 -- common/autotest_common.sh@877 -- # return 0 00:06:04.110 02:53:59 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:04.110 02:53:59 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.110 02:53:59 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.110 02:53:59 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.110 02:53:59 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:04.368 { 00:06:04.368 "nbd_device": "/dev/nbd0", 00:06:04.368 "bdev_name": "Malloc0" 00:06:04.368 }, 00:06:04.368 { 00:06:04.368 "nbd_device": "/dev/nbd1", 00:06:04.368 "bdev_name": "Malloc1" 00:06:04.368 } 00:06:04.368 ]' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:04.368 { 00:06:04.368 "nbd_device": "/dev/nbd0", 00:06:04.368 "bdev_name": "Malloc0" 00:06:04.368 }, 00:06:04.368 { 00:06:04.368 "nbd_device": "/dev/nbd1", 00:06:04.368 "bdev_name": "Malloc1" 00:06:04.368 } 00:06:04.368 ]' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:04.368 /dev/nbd1' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:04.368 /dev/nbd1' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@65 -- # count=2 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:04.368 02:53:59 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:04.368 256+0 records in 00:06:04.368 256+0 records out 00:06:04.368 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114213 s, 91.8 MB/s 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:04.368 256+0 records in 00:06:04.368 256+0 records out 00:06:04.368 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201932 s, 51.9 MB/s 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:04.368 256+0 records in 00:06:04.368 256+0 records out 00:06:04.368 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212823 s, 49.3 MB/s 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.368 02:53:59 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.369 02:53:59 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:04.369 02:53:59 -- bdev/nbd_common.sh@51 -- # local i 00:06:04.369 02:53:59 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:04.369 02:53:59 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@41 -- # break 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@45 -- # return 0 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:04.627 02:53:59 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@41 -- # break 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@45 -- # return 0 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.886 02:53:59 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@65 -- # true 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@65 -- # count=0 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@104 -- # count=0 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:05.145 02:54:00 -- bdev/nbd_common.sh@109 -- # return 0 00:06:05.145 02:54:00 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:05.145 02:54:00 -- event/event.sh@35 -- # sleep 3 00:06:05.404 [2024-07-14 02:54:00.570661] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:05.404 [2024-07-14 02:54:00.603253] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:05.404 [2024-07-14 02:54:00.603256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.404 [2024-07-14 02:54:00.643650] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:05.404 [2024-07-14 02:54:00.643691] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
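# A minimal standalone sketch of the waitfornbd_exit helper exercised in the
# trace above, reconstructed from the traced commands: the 20-try limit and
# the /proc/partitions check appear verbatim in the log, while the sleep
# between retries is an assumption (no-op iterations are elided by xtrace).
waitfornbd_exit() {
    local nbd_name=$1 i
    for (( i = 1; i <= 20; i++ )); do
        # done once the kernel no longer lists the nbd device as a partition
        grep -q -w "$nbd_name" /proc/partitions || break
        sleep 0.1   # assumed back-off; only the grep is visible in the trace
    done
    return 0
}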
00:06:08.694 02:54:03 -- event/event.sh@38 -- # waitforlisten 647652 /var/tmp/spdk-nbd.sock 00:06:08.694 02:54:03 -- common/autotest_common.sh@819 -- # '[' -z 647652 ']' 00:06:08.694 02:54:03 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:08.694 02:54:03 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:08.694 02:54:03 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:08.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:08.694 02:54:03 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:08.694 02:54:03 -- common/autotest_common.sh@10 -- # set +x 00:06:08.694 02:54:03 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:08.694 02:54:03 -- common/autotest_common.sh@852 -- # return 0 00:06:08.694 02:54:03 -- event/event.sh@39 -- # killprocess 647652 00:06:08.694 02:54:03 -- common/autotest_common.sh@926 -- # '[' -z 647652 ']' 00:06:08.694 02:54:03 -- common/autotest_common.sh@930 -- # kill -0 647652 00:06:08.694 02:54:03 -- common/autotest_common.sh@931 -- # uname 00:06:08.694 02:54:03 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:08.694 02:54:03 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 647652 00:06:08.694 02:54:03 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:08.694 02:54:03 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:08.694 02:54:03 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 647652' 00:06:08.694 killing process with pid 647652 00:06:08.694 02:54:03 -- common/autotest_common.sh@945 -- # kill 647652 00:06:08.694 02:54:03 -- common/autotest_common.sh@950 -- # wait 647652 00:06:08.694 spdk_app_start is called in Round 0. 00:06:08.694 Shutdown signal received, stop current app iteration 00:06:08.694 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 reinitialization... 00:06:08.694 spdk_app_start is called in Round 1. 00:06:08.694 Shutdown signal received, stop current app iteration 00:06:08.694 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 reinitialization... 00:06:08.694 spdk_app_start is called in Round 2. 00:06:08.694 Shutdown signal received, stop current app iteration 00:06:08.694 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 reinitialization... 00:06:08.694 spdk_app_start is called in Round 3. 
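# The "Round 0" through "Round 3" messages above come from the app_repeat
# test, which repeatedly asks the event app to shut down over RPC and lets it
# come back up. A simplified sketch of that cycle; the socket path, the
# SIGTERM argument and the 3-second pause are taken from the trace, while the
# bare for-loop is an assumption (the real rounds are driven from event.sh):
for round in 0 1 2 3; do
    scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
    sleep 3   # grace period before the next round re-listens on the socket
done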
00:06:08.694 Shutdown signal received, stop current app iteration 00:06:08.694 02:54:03 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:08.694 02:54:03 -- event/event.sh@42 -- # return 0 00:06:08.694 00:06:08.694 real 0m16.231s 00:06:08.694 user 0m34.551s 00:06:08.694 sys 0m3.099s 00:06:08.694 02:54:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.694 02:54:03 -- common/autotest_common.sh@10 -- # set +x 00:06:08.694 ************************************ 00:06:08.694 END TEST app_repeat 00:06:08.694 ************************************ 00:06:08.694 02:54:03 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:08.694 02:54:03 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:08.694 02:54:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:08.694 02:54:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:08.694 02:54:03 -- common/autotest_common.sh@10 -- # set +x 00:06:08.694 ************************************ 00:06:08.694 START TEST cpu_locks 00:06:08.694 ************************************ 00:06:08.694 02:54:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:08.694 * Looking for test storage... 00:06:08.694 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:08.694 02:54:03 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:08.694 02:54:03 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:08.694 02:54:03 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:08.695 02:54:03 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:08.695 02:54:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:08.695 02:54:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:08.695 02:54:03 -- common/autotest_common.sh@10 -- # set +x 00:06:08.695 ************************************ 00:06:08.695 START TEST default_locks 00:06:08.695 ************************************ 00:06:08.695 02:54:03 -- common/autotest_common.sh@1104 -- # default_locks 00:06:08.695 02:54:03 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:08.695 02:54:03 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=650846 00:06:08.695 02:54:03 -- event/cpu_locks.sh@47 -- # waitforlisten 650846 00:06:08.695 02:54:03 -- common/autotest_common.sh@819 -- # '[' -z 650846 ']' 00:06:08.695 02:54:03 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.695 02:54:03 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:08.695 02:54:03 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.695 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.695 02:54:03 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:08.695 02:54:03 -- common/autotest_common.sh@10 -- # set +x 00:06:08.953 [2024-07-14 02:54:03.950780] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:08.953 [2024-07-14 02:54:03.950859] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid650846 ] 00:06:08.953 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.953 [2024-07-14 02:54:04.019150] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.953 [2024-07-14 02:54:04.056588] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:08.953 [2024-07-14 02:54:04.056697] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.566 02:54:04 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:09.566 02:54:04 -- common/autotest_common.sh@852 -- # return 0 00:06:09.566 02:54:04 -- event/cpu_locks.sh@49 -- # locks_exist 650846 00:06:09.566 02:54:04 -- event/cpu_locks.sh@22 -- # lslocks -p 650846 00:06:09.566 02:54:04 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:09.861 lslocks: write error 00:06:09.861 02:54:05 -- event/cpu_locks.sh@50 -- # killprocess 650846 00:06:09.861 02:54:05 -- common/autotest_common.sh@926 -- # '[' -z 650846 ']' 00:06:09.861 02:54:05 -- common/autotest_common.sh@930 -- # kill -0 650846 00:06:09.861 02:54:05 -- common/autotest_common.sh@931 -- # uname 00:06:09.861 02:54:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:09.861 02:54:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 650846 00:06:09.861 02:54:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:09.861 02:54:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:09.861 02:54:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 650846' 00:06:09.861 killing process with pid 650846 00:06:09.861 02:54:05 -- common/autotest_common.sh@945 -- # kill 650846 00:06:09.861 02:54:05 -- common/autotest_common.sh@950 -- # wait 650846 00:06:10.430 02:54:05 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 650846 00:06:10.430 02:54:05 -- common/autotest_common.sh@640 -- # local es=0 00:06:10.430 02:54:05 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 650846 00:06:10.430 02:54:05 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:10.430 02:54:05 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:10.430 02:54:05 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:10.430 02:54:05 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:10.430 02:54:05 -- common/autotest_common.sh@643 -- # waitforlisten 650846 00:06:10.430 02:54:05 -- common/autotest_common.sh@819 -- # '[' -z 650846 ']' 00:06:10.430 02:54:05 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.430 02:54:05 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:10.430 02:54:05 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
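# locks_exist, traced above, verifies that spdk_tgt holds a flock on its
# per-core lock files (/var/tmp/spdk_cpu_lock_*, spelled out later in the
# trace). The body below is taken directly from the traced commands; the
# "lslocks: write error" printed in the log is lslocks complaining that
# grep -q closed the pipe early, not a test failure:
locks_exist() {
    local pid=$1
    lslocks -p "$pid" | grep -q spdk_cpu_lock
}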
00:06:10.430 02:54:05 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:10.430 02:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:10.430 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (650846) - No such process 00:06:10.430 ERROR: process (pid: 650846) is no longer running 00:06:10.430 02:54:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:10.430 02:54:05 -- common/autotest_common.sh@852 -- # return 1 00:06:10.430 02:54:05 -- common/autotest_common.sh@643 -- # es=1 00:06:10.430 02:54:05 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:10.430 02:54:05 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:10.430 02:54:05 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:10.430 02:54:05 -- event/cpu_locks.sh@54 -- # no_locks 00:06:10.430 02:54:05 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:10.430 02:54:05 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:10.430 02:54:05 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:10.430 00:06:10.430 real 0m1.469s 00:06:10.430 user 0m1.509s 00:06:10.430 sys 0m0.535s 00:06:10.430 02:54:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.430 02:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:10.430 ************************************ 00:06:10.430 END TEST default_locks 00:06:10.430 ************************************ 00:06:10.430 02:54:05 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:10.430 02:54:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:10.431 02:54:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:10.431 02:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:10.431 ************************************ 00:06:10.431 START TEST default_locks_via_rpc 00:06:10.431 ************************************ 00:06:10.431 02:54:05 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:06:10.431 02:54:05 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=651481 00:06:10.431 02:54:05 -- event/cpu_locks.sh@63 -- # waitforlisten 651481 00:06:10.431 02:54:05 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:10.431 02:54:05 -- common/autotest_common.sh@819 -- # '[' -z 651481 ']' 00:06:10.431 02:54:05 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.431 02:54:05 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:10.431 02:54:05 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.431 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.431 02:54:05 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:10.431 02:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:10.431 [2024-07-14 02:54:05.461995] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
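# The "NOT waitforlisten 650846" block above asserts that waiting on a killed
# pid fails. A condensed sketch of the inversion logic visible in the trace;
# the real helper in autotest_common.sh also validates the wrapped command
# and normalizes signal exits ((( es > 128 ))), both no-ops in this run:
NOT() {
    local es=0
    "$@" || es=$?
    (( !es == 0 ))   # succeed only when the wrapped command failed
}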
00:06:10.431 [2024-07-14 02:54:05.462093] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid651481 ] 00:06:10.431 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.431 [2024-07-14 02:54:05.531670] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.431 [2024-07-14 02:54:05.567714] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:10.431 [2024-07-14 02:54:05.567829] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.370 02:54:06 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:11.370 02:54:06 -- common/autotest_common.sh@852 -- # return 0 00:06:11.370 02:54:06 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:11.370 02:54:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:11.370 02:54:06 -- common/autotest_common.sh@10 -- # set +x 00:06:11.370 02:54:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:11.370 02:54:06 -- event/cpu_locks.sh@67 -- # no_locks 00:06:11.370 02:54:06 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:11.370 02:54:06 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:11.370 02:54:06 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:11.370 02:54:06 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:11.370 02:54:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:11.370 02:54:06 -- common/autotest_common.sh@10 -- # set +x 00:06:11.370 02:54:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:11.370 02:54:06 -- event/cpu_locks.sh@71 -- # locks_exist 651481 00:06:11.370 02:54:06 -- event/cpu_locks.sh@22 -- # lslocks -p 651481 00:06:11.370 02:54:06 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:11.939 02:54:06 -- event/cpu_locks.sh@73 -- # killprocess 651481 00:06:11.939 02:54:06 -- common/autotest_common.sh@926 -- # '[' -z 651481 ']' 00:06:11.939 02:54:06 -- common/autotest_common.sh@930 -- # kill -0 651481 00:06:11.939 02:54:06 -- common/autotest_common.sh@931 -- # uname 00:06:11.939 02:54:06 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:11.939 02:54:06 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 651481 00:06:11.939 02:54:06 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:11.939 02:54:06 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:11.939 02:54:06 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 651481' 00:06:11.939 killing process with pid 651481 00:06:11.939 02:54:06 -- common/autotest_common.sh@945 -- # kill 651481 00:06:11.939 02:54:06 -- common/autotest_common.sh@950 -- # wait 651481 00:06:12.198 00:06:12.198 real 0m1.769s 00:06:12.198 user 0m1.828s 00:06:12.198 sys 0m0.624s 00:06:12.198 02:54:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.198 02:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:12.198 ************************************ 00:06:12.198 END TEST default_locks_via_rpc 00:06:12.198 ************************************ 00:06:12.198 02:54:07 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:12.198 02:54:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:12.198 02:54:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:12.198 02:54:07 -- common/autotest_common.sh@10 
-- # set +x 00:06:12.198 ************************************ 00:06:12.198 START TEST non_locking_app_on_locked_coremask 00:06:12.198 ************************************ 00:06:12.198 02:54:07 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:06:12.198 02:54:07 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=651969 00:06:12.198 02:54:07 -- event/cpu_locks.sh@81 -- # waitforlisten 651969 /var/tmp/spdk.sock 00:06:12.198 02:54:07 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:12.198 02:54:07 -- common/autotest_common.sh@819 -- # '[' -z 651969 ']' 00:06:12.198 02:54:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:12.198 02:54:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:12.198 02:54:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:12.198 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:12.198 02:54:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:12.198 02:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:12.198 [2024-07-14 02:54:07.273059] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:12.198 [2024-07-14 02:54:07.273146] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid651969 ] 00:06:12.198 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.198 [2024-07-14 02:54:07.342281] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.198 [2024-07-14 02:54:07.379224] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:12.198 [2024-07-14 02:54:07.379335] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.138 02:54:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:13.138 02:54:08 -- common/autotest_common.sh@852 -- # return 0 00:06:13.138 02:54:08 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=652018 00:06:13.138 02:54:08 -- event/cpu_locks.sh@85 -- # waitforlisten 652018 /var/tmp/spdk2.sock 00:06:13.138 02:54:08 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:13.138 02:54:08 -- common/autotest_common.sh@819 -- # '[' -z 652018 ']' 00:06:13.138 02:54:08 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:13.138 02:54:08 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:13.138 02:54:08 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:13.138 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:13.138 02:54:08 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:13.138 02:54:08 -- common/autotest_common.sh@10 -- # set +x 00:06:13.138 [2024-07-14 02:54:08.104674] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
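# default_locks_via_rpc, traced just above, toggles core-lock claiming at
# runtime instead of at startup. The two RPCs as exercised there; rpc_cmd is
# the autotest wrapper around scripts/rpc.py, and calling the script directly
# is assumed to be equivalent:
scripts/rpc.py framework_disable_cpumask_locks   # releases the spdk_cpu_lock_* files
scripts/rpc.py framework_enable_cpumask_locks    # re-claims them; locks_exist passes again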
00:06:13.138 [2024-07-14 02:54:08.104762] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid652018 ] 00:06:13.138 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.138 [2024-07-14 02:54:08.191047] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:13.138 [2024-07-14 02:54:08.191072] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.138 [2024-07-14 02:54:08.263505] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:13.138 [2024-07-14 02:54:08.263620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.708 02:54:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:13.708 02:54:08 -- common/autotest_common.sh@852 -- # return 0 00:06:13.708 02:54:08 -- event/cpu_locks.sh@87 -- # locks_exist 651969 00:06:13.708 02:54:08 -- event/cpu_locks.sh@22 -- # lslocks -p 651969 00:06:13.708 02:54:08 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:15.087 lslocks: write error 00:06:15.087 02:54:10 -- event/cpu_locks.sh@89 -- # killprocess 651969 00:06:15.087 02:54:10 -- common/autotest_common.sh@926 -- # '[' -z 651969 ']' 00:06:15.087 02:54:10 -- common/autotest_common.sh@930 -- # kill -0 651969 00:06:15.087 02:54:10 -- common/autotest_common.sh@931 -- # uname 00:06:15.087 02:54:10 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.087 02:54:10 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 651969 00:06:15.087 02:54:10 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.087 02:54:10 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.087 02:54:10 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 651969' 00:06:15.087 killing process with pid 651969 00:06:15.087 02:54:10 -- common/autotest_common.sh@945 -- # kill 651969 00:06:15.087 02:54:10 -- common/autotest_common.sh@950 -- # wait 651969 00:06:15.655 02:54:10 -- event/cpu_locks.sh@90 -- # killprocess 652018 00:06:15.655 02:54:10 -- common/autotest_common.sh@926 -- # '[' -z 652018 ']' 00:06:15.655 02:54:10 -- common/autotest_common.sh@930 -- # kill -0 652018 00:06:15.655 02:54:10 -- common/autotest_common.sh@931 -- # uname 00:06:15.655 02:54:10 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.655 02:54:10 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 652018 00:06:15.655 02:54:10 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.655 02:54:10 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.655 02:54:10 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 652018' 00:06:15.655 killing process with pid 652018 00:06:15.655 02:54:10 -- common/autotest_common.sh@945 -- # kill 652018 00:06:15.655 02:54:10 -- common/autotest_common.sh@950 -- # wait 652018 00:06:15.915 00:06:15.915 real 0m3.745s 00:06:15.915 user 0m4.001s 00:06:15.915 sys 0m1.251s 00:06:15.915 02:54:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.915 02:54:10 -- common/autotest_common.sh@10 -- # set +x 00:06:15.915 ************************************ 00:06:15.915 END TEST non_locking_app_on_locked_coremask 00:06:15.915 ************************************ 00:06:15.915 02:54:11 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 
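# non_locking_app_on_locked_coremask, which finishes above, shows that a
# second target may share an already-claimed core as long as it opts out of
# claiming. Minimal reproduction with the flags and socket path as traced:
build/bin/spdk_tgt -m 0x1 &                       # claims the core 0 lock
build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks \
    -r /var/tmp/spdk2.sock &                      # same core, no claim: both run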
00:06:15.915 02:54:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:15.915 02:54:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:15.915 02:54:11 -- common/autotest_common.sh@10 -- # set +x 00:06:15.915 ************************************ 00:06:15.915 START TEST locking_app_on_unlocked_coremask 00:06:15.915 ************************************ 00:06:15.915 02:54:11 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:06:15.915 02:54:11 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:15.915 02:54:11 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=652596 00:06:15.915 02:54:11 -- event/cpu_locks.sh@99 -- # waitforlisten 652596 /var/tmp/spdk.sock 00:06:15.915 02:54:11 -- common/autotest_common.sh@819 -- # '[' -z 652596 ']' 00:06:15.915 02:54:11 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.915 02:54:11 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:15.915 02:54:11 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.915 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.915 02:54:11 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:15.915 02:54:11 -- common/autotest_common.sh@10 -- # set +x 00:06:15.915 [2024-07-14 02:54:11.049937] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:15.915 [2024-07-14 02:54:11.050012] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid652596 ] 00:06:15.915 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.915 [2024-07-14 02:54:11.113761] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:15.915 [2024-07-14 02:54:11.113785] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.915 [2024-07-14 02:54:11.151227] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:15.915 [2024-07-14 02:54:11.151336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.851 02:54:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:16.851 02:54:11 -- common/autotest_common.sh@852 -- # return 0 00:06:16.851 02:54:11 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=652824 00:06:16.851 02:54:11 -- event/cpu_locks.sh@103 -- # waitforlisten 652824 /var/tmp/spdk2.sock 00:06:16.851 02:54:11 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:16.851 02:54:11 -- common/autotest_common.sh@819 -- # '[' -z 652824 ']' 00:06:16.851 02:54:11 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.851 02:54:11 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:16.851 02:54:11 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
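# locking_app_on_unlocked_coremask, starting above, inverts the previous
# test: here the first target is the one that opts out of claiming, so the
# second, locking target can still take core 0. Flags as traced:
build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks &   # leaves core 0 unclaimed
build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &    # claims core 0 normally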
00:06:16.851 02:54:11 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:16.851 02:54:11 -- common/autotest_common.sh@10 -- # set +x 00:06:16.851 [2024-07-14 02:54:11.898415] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:16.851 [2024-07-14 02:54:11.898484] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid652824 ] 00:06:16.851 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.851 [2024-07-14 02:54:11.988159] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.851 [2024-07-14 02:54:12.061085] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.851 [2024-07-14 02:54:12.061190] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.788 02:54:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:17.788 02:54:12 -- common/autotest_common.sh@852 -- # return 0 00:06:17.788 02:54:12 -- event/cpu_locks.sh@105 -- # locks_exist 652824 00:06:17.788 02:54:12 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:17.788 02:54:12 -- event/cpu_locks.sh@22 -- # lslocks -p 652824 00:06:18.354 lslocks: write error 00:06:18.354 02:54:13 -- event/cpu_locks.sh@107 -- # killprocess 652596 00:06:18.354 02:54:13 -- common/autotest_common.sh@926 -- # '[' -z 652596 ']' 00:06:18.354 02:54:13 -- common/autotest_common.sh@930 -- # kill -0 652596 00:06:18.354 02:54:13 -- common/autotest_common.sh@931 -- # uname 00:06:18.354 02:54:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:18.354 02:54:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 652596 00:06:18.354 02:54:13 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:18.354 02:54:13 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:18.354 02:54:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 652596' 00:06:18.354 killing process with pid 652596 00:06:18.354 02:54:13 -- common/autotest_common.sh@945 -- # kill 652596 00:06:18.354 02:54:13 -- common/autotest_common.sh@950 -- # wait 652596 00:06:18.918 02:54:13 -- event/cpu_locks.sh@108 -- # killprocess 652824 00:06:18.918 02:54:13 -- common/autotest_common.sh@926 -- # '[' -z 652824 ']' 00:06:18.918 02:54:13 -- common/autotest_common.sh@930 -- # kill -0 652824 00:06:18.918 02:54:13 -- common/autotest_common.sh@931 -- # uname 00:06:18.919 02:54:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:18.919 02:54:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 652824 00:06:18.919 02:54:13 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:18.919 02:54:13 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:18.919 02:54:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 652824' 00:06:18.919 killing process with pid 652824 00:06:18.919 02:54:13 -- common/autotest_common.sh@945 -- # kill 652824 00:06:18.919 02:54:13 -- common/autotest_common.sh@950 -- # wait 652824 00:06:19.176 00:06:19.176 real 0m3.244s 00:06:19.176 user 0m3.432s 00:06:19.176 sys 0m1.059s 00:06:19.176 02:54:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.176 02:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:19.176 ************************************ 00:06:19.176 END TEST locking_app_on_unlocked_coremask 00:06:19.176 
************************************ 00:06:19.176 02:54:14 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:19.176 02:54:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:19.176 02:54:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.176 02:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:19.176 ************************************ 00:06:19.176 START TEST locking_app_on_locked_coremask 00:06:19.176 ************************************ 00:06:19.176 02:54:14 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:06:19.176 02:54:14 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=653173 00:06:19.176 02:54:14 -- event/cpu_locks.sh@116 -- # waitforlisten 653173 /var/tmp/spdk.sock 00:06:19.176 02:54:14 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:19.176 02:54:14 -- common/autotest_common.sh@819 -- # '[' -z 653173 ']' 00:06:19.176 02:54:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.176 02:54:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:19.176 02:54:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.176 02:54:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:19.176 02:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:19.176 [2024-07-14 02:54:14.344503] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:19.176 [2024-07-14 02:54:14.344594] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653173 ] 00:06:19.176 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.176 [2024-07-14 02:54:14.411475] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.434 [2024-07-14 02:54:14.448391] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.434 [2024-07-14 02:54:14.448504] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.001 02:54:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:20.001 02:54:15 -- common/autotest_common.sh@852 -- # return 0 00:06:20.001 02:54:15 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:20.001 02:54:15 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=653437 00:06:20.001 02:54:15 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 653437 /var/tmp/spdk2.sock 00:06:20.001 02:54:15 -- common/autotest_common.sh@640 -- # local es=0 00:06:20.001 02:54:15 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 653437 /var/tmp/spdk2.sock 00:06:20.001 02:54:15 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:20.001 02:54:15 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:20.001 02:54:15 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:20.001 02:54:15 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:20.001 02:54:15 -- common/autotest_common.sh@643 -- # waitforlisten 653437 /var/tmp/spdk2.sock 00:06:20.001 02:54:15 -- common/autotest_common.sh@819 -- # '[' -z 653437 
']' 00:06:20.001 02:54:15 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:20.001 02:54:15 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:20.001 02:54:15 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:20.001 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:20.001 02:54:15 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:20.001 02:54:15 -- common/autotest_common.sh@10 -- # set +x 00:06:20.001 [2024-07-14 02:54:15.166461] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:20.001 [2024-07-14 02:54:15.166528] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653437 ] 00:06:20.001 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.001 [2024-07-14 02:54:15.250372] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 653173 has claimed it. 00:06:20.001 [2024-07-14 02:54:15.250404] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:20.568 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (653437) - No such process 00:06:20.568 ERROR: process (pid: 653437) is no longer running 00:06:20.568 02:54:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:20.568 02:54:15 -- common/autotest_common.sh@852 -- # return 1 00:06:20.568 02:54:15 -- common/autotest_common.sh@643 -- # es=1 00:06:20.568 02:54:15 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:20.568 02:54:15 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:20.568 02:54:15 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:20.568 02:54:15 -- event/cpu_locks.sh@122 -- # locks_exist 653173 00:06:20.827 02:54:15 -- event/cpu_locks.sh@22 -- # lslocks -p 653173 00:06:20.827 02:54:15 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:21.396 lslocks: write error 00:06:21.396 02:54:16 -- event/cpu_locks.sh@124 -- # killprocess 653173 00:06:21.396 02:54:16 -- common/autotest_common.sh@926 -- # '[' -z 653173 ']' 00:06:21.396 02:54:16 -- common/autotest_common.sh@930 -- # kill -0 653173 00:06:21.396 02:54:16 -- common/autotest_common.sh@931 -- # uname 00:06:21.396 02:54:16 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:21.396 02:54:16 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 653173 00:06:21.396 02:54:16 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:21.396 02:54:16 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:21.396 02:54:16 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 653173' 00:06:21.396 killing process with pid 653173 00:06:21.396 02:54:16 -- common/autotest_common.sh@945 -- # kill 653173 00:06:21.396 02:54:16 -- common/autotest_common.sh@950 -- # wait 653173 00:06:21.655 00:06:21.655 real 0m2.462s 00:06:21.655 user 0m2.663s 00:06:21.655 sys 0m0.743s 00:06:21.655 02:54:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.655 02:54:16 -- common/autotest_common.sh@10 -- # set +x 00:06:21.655 ************************************ 00:06:21.655 END TEST locking_app_on_locked_coremask 00:06:21.655 ************************************ 00:06:21.656 02:54:16 -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:21.656 02:54:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:21.656 02:54:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:21.656 02:54:16 -- common/autotest_common.sh@10 -- # set +x 00:06:21.656 ************************************ 00:06:21.656 START TEST locking_overlapped_coremask 00:06:21.656 ************************************ 00:06:21.656 02:54:16 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:21.656 02:54:16 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=653746 00:06:21.656 02:54:16 -- event/cpu_locks.sh@133 -- # waitforlisten 653746 /var/tmp/spdk.sock 00:06:21.656 02:54:16 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:21.656 02:54:16 -- common/autotest_common.sh@819 -- # '[' -z 653746 ']' 00:06:21.656 02:54:16 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.656 02:54:16 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:21.656 02:54:16 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.656 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.656 02:54:16 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:21.656 02:54:16 -- common/autotest_common.sh@10 -- # set +x 00:06:21.656 [2024-07-14 02:54:16.850600] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:21.656 [2024-07-14 02:54:16.850689] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653746 ] 00:06:21.656 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.915 [2024-07-14 02:54:16.918398] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:21.915 [2024-07-14 02:54:16.956591] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.915 [2024-07-14 02:54:16.956731] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.915 [2024-07-14 02:54:16.956822] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.915 [2024-07-14 02:54:16.956824] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:22.483 02:54:17 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:22.483 02:54:17 -- common/autotest_common.sh@852 -- # return 0 00:06:22.483 02:54:17 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=653791 00:06:22.483 02:54:17 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 653791 /var/tmp/spdk2.sock 00:06:22.483 02:54:17 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:22.483 02:54:17 -- common/autotest_common.sh@640 -- # local es=0 00:06:22.483 02:54:17 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 653791 /var/tmp/spdk2.sock 00:06:22.483 02:54:17 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:22.483 02:54:17 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:22.483 02:54:17 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:22.483 02:54:17 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:22.483 02:54:17 -- 
common/autotest_common.sh@643 -- # waitforlisten 653791 /var/tmp/spdk2.sock 00:06:22.484 02:54:17 -- common/autotest_common.sh@819 -- # '[' -z 653791 ']' 00:06:22.484 02:54:17 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:22.484 02:54:17 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:22.484 02:54:17 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:22.484 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:22.484 02:54:17 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:22.484 02:54:17 -- common/autotest_common.sh@10 -- # set +x 00:06:22.484 [2024-07-14 02:54:17.695998] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:22.484 [2024-07-14 02:54:17.696061] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653791 ] 00:06:22.484 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.743 [2024-07-14 02:54:17.793419] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 653746 has claimed it. 00:06:22.743 [2024-07-14 02:54:17.793461] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:23.311 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (653791) - No such process 00:06:23.311 ERROR: process (pid: 653791) is no longer running 00:06:23.311 02:54:18 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:23.311 02:54:18 -- common/autotest_common.sh@852 -- # return 1 00:06:23.311 02:54:18 -- common/autotest_common.sh@643 -- # es=1 00:06:23.311 02:54:18 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:23.311 02:54:18 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:23.311 02:54:18 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:23.311 02:54:18 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:23.311 02:54:18 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:23.311 02:54:18 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:23.311 02:54:18 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:23.311 02:54:18 -- event/cpu_locks.sh@141 -- # killprocess 653746 00:06:23.311 02:54:18 -- common/autotest_common.sh@926 -- # '[' -z 653746 ']' 00:06:23.311 02:54:18 -- common/autotest_common.sh@930 -- # kill -0 653746 00:06:23.311 02:54:18 -- common/autotest_common.sh@931 -- # uname 00:06:23.311 02:54:18 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:23.311 02:54:18 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 653746 00:06:23.311 02:54:18 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:23.311 02:54:18 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:23.311 02:54:18 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 653746' 00:06:23.311 killing process with pid 653746 00:06:23.311 02:54:18 -- common/autotest_common.sh@945 -- # kill 653746 00:06:23.311 02:54:18 -- 
common/autotest_common.sh@950 -- # wait 653746 00:06:23.570 00:06:23.570 real 0m1.867s 00:06:23.570 user 0m5.352s 00:06:23.570 sys 0m0.443s 00:06:23.570 02:54:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.570 02:54:18 -- common/autotest_common.sh@10 -- # set +x 00:06:23.570 ************************************ 00:06:23.570 END TEST locking_overlapped_coremask 00:06:23.570 ************************************ 00:06:23.570 02:54:18 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:23.570 02:54:18 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:23.570 02:54:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.570 02:54:18 -- common/autotest_common.sh@10 -- # set +x 00:06:23.570 ************************************ 00:06:23.570 START TEST locking_overlapped_coremask_via_rpc 00:06:23.570 ************************************ 00:06:23.570 02:54:18 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:23.570 02:54:18 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=654046 00:06:23.570 02:54:18 -- event/cpu_locks.sh@149 -- # waitforlisten 654046 /var/tmp/spdk.sock 00:06:23.570 02:54:18 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:23.570 02:54:18 -- common/autotest_common.sh@819 -- # '[' -z 654046 ']' 00:06:23.570 02:54:18 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.570 02:54:18 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:23.570 02:54:18 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.570 02:54:18 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:23.570 02:54:18 -- common/autotest_common.sh@10 -- # set +x 00:06:23.570 [2024-07-14 02:54:18.767811] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:23.570 [2024-07-14 02:54:18.767904] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654046 ] 00:06:23.571 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.830 [2024-07-14 02:54:18.834220] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
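# locking_overlapped_coremask, which just ended above, uses masks 0x7
# (cores 0-2) and 0x1c (cores 2-4), overlapping on core 2; the second target
# aborts with the claim_cpu_cores error "Cannot create lock on core 2" seen
# in the trace. Minimal reproduction, flags as traced:
build/bin/spdk_tgt -m 0x7 &                        # locks cores 0, 1 and 2
build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock  # fails: core 2 already locked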
00:06:23.830 [2024-07-14 02:54:18.834254] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:23.830 [2024-07-14 02:54:18.867628] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:23.830 [2024-07-14 02:54:18.867775] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.830 [2024-07-14 02:54:18.867875] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:23.830 [2024-07-14 02:54:18.867877] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.398 02:54:19 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:24.398 02:54:19 -- common/autotest_common.sh@852 -- # return 0 00:06:24.398 02:54:19 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=654289 00:06:24.398 02:54:19 -- event/cpu_locks.sh@153 -- # waitforlisten 654289 /var/tmp/spdk2.sock 00:06:24.398 02:54:19 -- common/autotest_common.sh@819 -- # '[' -z 654289 ']' 00:06:24.398 02:54:19 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.398 02:54:19 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:24.398 02:54:19 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.398 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:24.398 02:54:19 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:24.398 02:54:19 -- common/autotest_common.sh@10 -- # set +x 00:06:24.398 02:54:19 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:24.398 [2024-07-14 02:54:19.599390] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:24.398 [2024-07-14 02:54:19.599460] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654289 ] 00:06:24.398 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.658 [2024-07-14 02:54:19.691422] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
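# check_remaining_locks, seen in the traces on either side of this point,
# asserts that exactly the lock files for cores 0-2 survive the test. The
# body below is lifted from the traced expansion:
check_remaining_locks() {
    local locks=(/var/tmp/spdk_cpu_lock_*)
    local locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
    [[ ${locks[*]} == "${locks_expected[*]}" ]]
}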
00:06:24.658 [2024-07-14 02:54:19.691457] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:24.658 [2024-07-14 02:54:19.764992] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.658 [2024-07-14 02:54:19.765143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:24.658 [2024-07-14 02:54:19.768490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:24.658 [2024-07-14 02:54:19.768491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:25.226 02:54:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.226 02:54:20 -- common/autotest_common.sh@852 -- # return 0 00:06:25.226 02:54:20 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:25.226 02:54:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:25.226 02:54:20 -- common/autotest_common.sh@10 -- # set +x 00:06:25.226 02:54:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:25.226 02:54:20 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:25.226 02:54:20 -- common/autotest_common.sh@640 -- # local es=0 00:06:25.226 02:54:20 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:25.226 02:54:20 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:25.226 02:54:20 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:25.226 02:54:20 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:25.226 02:54:20 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:25.226 02:54:20 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:25.226 02:54:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:25.226 02:54:20 -- common/autotest_common.sh@10 -- # set +x 00:06:25.226 [2024-07-14 02:54:20.437507] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 654046 has claimed it. 00:06:25.226 request: 00:06:25.226 { 00:06:25.226 "method": "framework_enable_cpumask_locks", 00:06:25.226 "req_id": 1 00:06:25.226 } 00:06:25.226 Got JSON-RPC error response 00:06:25.226 response: 00:06:25.226 { 00:06:25.226 "code": -32603, 00:06:25.226 "message": "Failed to claim CPU core: 2" 00:06:25.226 } 00:06:25.226 02:54:20 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:25.226 02:54:20 -- common/autotest_common.sh@643 -- # es=1 00:06:25.226 02:54:20 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:25.226 02:54:20 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:25.226 02:54:20 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:25.226 02:54:20 -- event/cpu_locks.sh@158 -- # waitforlisten 654046 /var/tmp/spdk.sock 00:06:25.226 02:54:20 -- common/autotest_common.sh@819 -- # '[' -z 654046 ']' 00:06:25.226 02:54:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.226 02:54:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:25.226 02:54:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.226 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
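# The JSON-RPC exchange above shows the second target refusing
# framework_enable_cpumask_locks with error -32603 ("Failed to claim CPU
# core: 2") because the first target still holds core 2. From the shell the
# failure surfaces as a non-zero exit from rpc.py (socket path as traced):
if ! scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks; then
    echo 'lock claim refused, as the NOT rpc_cmd assertion above expects'
fi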
00:06:25.226 02:54:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:25.226 02:54:20 -- common/autotest_common.sh@10 -- # set +x 00:06:25.485 02:54:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.485 02:54:20 -- common/autotest_common.sh@852 -- # return 0 00:06:25.485 02:54:20 -- event/cpu_locks.sh@159 -- # waitforlisten 654289 /var/tmp/spdk2.sock 00:06:25.485 02:54:20 -- common/autotest_common.sh@819 -- # '[' -z 654289 ']' 00:06:25.485 02:54:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:25.485 02:54:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:25.485 02:54:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:25.485 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:25.485 02:54:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:25.485 02:54:20 -- common/autotest_common.sh@10 -- # set +x 00:06:25.744 02:54:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.744 02:54:20 -- common/autotest_common.sh@852 -- # return 0 00:06:25.744 02:54:20 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:25.744 02:54:20 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:25.744 02:54:20 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:25.744 02:54:20 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:25.744 00:06:25.744 real 0m2.070s 00:06:25.744 user 0m0.807s 00:06:25.744 sys 0m0.195s 00:06:25.744 02:54:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.744 02:54:20 -- common/autotest_common.sh@10 -- # set +x 00:06:25.744 ************************************ 00:06:25.744 END TEST locking_overlapped_coremask_via_rpc 00:06:25.744 ************************************ 00:06:25.744 02:54:20 -- event/cpu_locks.sh@174 -- # cleanup 00:06:25.744 02:54:20 -- event/cpu_locks.sh@15 -- # [[ -z 654046 ]] 00:06:25.744 02:54:20 -- event/cpu_locks.sh@15 -- # killprocess 654046 00:06:25.744 02:54:20 -- common/autotest_common.sh@926 -- # '[' -z 654046 ']' 00:06:25.744 02:54:20 -- common/autotest_common.sh@930 -- # kill -0 654046 00:06:25.744 02:54:20 -- common/autotest_common.sh@931 -- # uname 00:06:25.744 02:54:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:25.744 02:54:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 654046 00:06:25.744 02:54:20 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:25.744 02:54:20 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:25.744 02:54:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 654046' 00:06:25.744 killing process with pid 654046 00:06:25.744 02:54:20 -- common/autotest_common.sh@945 -- # kill 654046 00:06:25.744 02:54:20 -- common/autotest_common.sh@950 -- # wait 654046 00:06:26.003 02:54:21 -- event/cpu_locks.sh@16 -- # [[ -z 654289 ]] 00:06:26.003 02:54:21 -- event/cpu_locks.sh@16 -- # killprocess 654289 00:06:26.003 02:54:21 -- common/autotest_common.sh@926 -- # '[' -z 654289 ']' 00:06:26.003 02:54:21 -- common/autotest_common.sh@930 -- # kill -0 654289 00:06:26.003 02:54:21 -- common/autotest_common.sh@931 -- # uname 00:06:26.003 
02:54:21 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:26.003 02:54:21 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 654289 00:06:26.262 02:54:21 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:26.262 02:54:21 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:26.262 02:54:21 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 654289' 00:06:26.262 killing process with pid 654289 00:06:26.262 02:54:21 -- common/autotest_common.sh@945 -- # kill 654289 00:06:26.262 02:54:21 -- common/autotest_common.sh@950 -- # wait 654289 00:06:26.521 02:54:21 -- event/cpu_locks.sh@18 -- # rm -f 00:06:26.521 02:54:21 -- event/cpu_locks.sh@1 -- # cleanup 00:06:26.521 02:54:21 -- event/cpu_locks.sh@15 -- # [[ -z 654046 ]] 00:06:26.521 02:54:21 -- event/cpu_locks.sh@15 -- # killprocess 654046 00:06:26.521 02:54:21 -- common/autotest_common.sh@926 -- # '[' -z 654046 ']' 00:06:26.521 02:54:21 -- common/autotest_common.sh@930 -- # kill -0 654046 00:06:26.521 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (654046) - No such process 00:06:26.521 02:54:21 -- common/autotest_common.sh@953 -- # echo 'Process with pid 654046 is not found' 00:06:26.521 Process with pid 654046 is not found 00:06:26.521 02:54:21 -- event/cpu_locks.sh@16 -- # [[ -z 654289 ]] 00:06:26.521 02:54:21 -- event/cpu_locks.sh@16 -- # killprocess 654289 00:06:26.521 02:54:21 -- common/autotest_common.sh@926 -- # '[' -z 654289 ']' 00:06:26.521 02:54:21 -- common/autotest_common.sh@930 -- # kill -0 654289 00:06:26.521 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (654289) - No such process 00:06:26.521 02:54:21 -- common/autotest_common.sh@953 -- # echo 'Process with pid 654289 is not found' 00:06:26.521 Process with pid 654289 is not found 00:06:26.521 02:54:21 -- event/cpu_locks.sh@18 -- # rm -f 00:06:26.521 00:06:26.521 real 0m17.760s 00:06:26.521 user 0m30.219s 00:06:26.521 sys 0m5.728s 00:06:26.521 02:54:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.521 02:54:21 -- common/autotest_common.sh@10 -- # set +x 00:06:26.521 ************************************ 00:06:26.521 END TEST cpu_locks 00:06:26.521 ************************************ 00:06:26.521 00:06:26.521 real 0m42.246s 00:06:26.521 user 1m18.875s 00:06:26.521 sys 0m9.720s 00:06:26.521 02:54:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.521 02:54:21 -- common/autotest_common.sh@10 -- # set +x 00:06:26.521 ************************************ 00:06:26.521 END TEST event 00:06:26.521 ************************************ 00:06:26.521 02:54:21 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:26.522 02:54:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:26.522 02:54:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.522 02:54:21 -- common/autotest_common.sh@10 -- # set +x 00:06:26.522 ************************************ 00:06:26.522 START TEST thread 00:06:26.522 ************************************ 00:06:26.522 02:54:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:26.522 * Looking for test storage... 
00:06:26.522 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:26.522 02:54:21 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:26.522 02:54:21 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:26.522 02:54:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.522 02:54:21 -- common/autotest_common.sh@10 -- # set +x 00:06:26.781 ************************************ 00:06:26.781 START TEST thread_poller_perf 00:06:26.781 ************************************ 00:06:26.781 02:54:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:26.781 [2024-07-14 02:54:21.790821] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:26.781 [2024-07-14 02:54:21.790915] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654690 ] 00:06:26.781 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.781 [2024-07-14 02:54:21.861043] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.781 [2024-07-14 02:54:21.897670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.781 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:27.717 ====================================== 00:06:27.718 busy:2505225854 (cyc) 00:06:27.718 total_run_count: 810000 00:06:27.718 tsc_hz: 2500000000 (cyc) 00:06:27.718 ====================================== 00:06:27.718 poller_cost: 3092 (cyc), 1236 (nsec) 00:06:27.718 00:06:27.718 real 0m1.180s 00:06:27.718 user 0m1.086s 00:06:27.718 sys 0m0.089s 00:06:27.718 02:54:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.718 02:54:22 -- common/autotest_common.sh@10 -- # set +x 00:06:27.718 ************************************ 00:06:27.718 END TEST thread_poller_perf 00:06:27.718 ************************************ 00:06:27.977 02:54:22 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:27.977 02:54:22 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:27.977 02:54:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.977 02:54:22 -- common/autotest_common.sh@10 -- # set +x 00:06:27.977 ************************************ 00:06:27.977 START TEST thread_poller_perf 00:06:27.977 ************************************ 00:06:27.977 02:54:23 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:27.977 [2024-07-14 02:54:23.009885] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:27.977 [2024-07-14 02:54:23.009949] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654980 ] 00:06:27.977 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.977 [2024-07-14 02:54:23.074276] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.977 [2024-07-14 02:54:23.108680] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.977 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:28.911 ====================================== 00:06:28.911 busy:2501906460 (cyc) 00:06:28.911 total_run_count: 13547000 00:06:28.911 tsc_hz: 2500000000 (cyc) 00:06:28.911 ====================================== 00:06:28.911 poller_cost: 184 (cyc), 73 (nsec) 00:06:28.911 00:06:28.911 real 0m1.161s 00:06:28.911 user 0m1.080s 00:06:28.911 sys 0m0.077s 00:06:28.911 02:54:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:28.911 02:54:24 -- common/autotest_common.sh@10 -- # set +x 00:06:28.911 ************************************ 00:06:28.911 END TEST thread_poller_perf 00:06:28.911 ************************************ 00:06:29.171 02:54:24 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:29.171 02:54:24 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:29.171 02:54:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:29.171 02:54:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:29.171 02:54:24 -- common/autotest_common.sh@10 -- # set +x 00:06:29.171 ************************************ 00:06:29.171 START TEST thread_spdk_lock 00:06:29.171 ************************************ 00:06:29.171 02:54:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:29.171 [2024-07-14 02:54:24.218980] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
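The poller_perf summaries above are internally consistent: poller_cost is busy cycles divided by total_run_count, and the nanosecond figure follows from the reported 2.5 GHz TSC. Checking the 0-period run with only numbers taken from this log:

    # poller_cost (cyc)  = busy / total_run_count
    echo $(( 2501906460 / 13547000 ))       # -> 184, matching "poller_cost: 184 (cyc)"
    # poller_cost (nsec) = cyc / tsc_hz = 184 / 2.5 GHz
    echo "scale=1; 184 / 2.5" | bc          # -> 73.6, reported truncated as "73 (nsec)"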
00:06:29.171 [2024-07-14 02:54:24.219044] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655233 ] 00:06:29.171 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.171 [2024-07-14 02:54:24.282456] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:29.171 [2024-07-14 02:54:24.317547] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.171 [2024-07-14 02:54:24.317551] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.739 [2024-07-14 02:54:24.809509] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.739 [2024-07-14 02:54:24.809548] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:29.739 [2024-07-14 02:54:24.809558] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x12f6280 00:06:29.739 [2024-07-14 02:54:24.810311] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.739 [2024-07-14 02:54:24.810417] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.739 [2024-07-14 02:54:24.810438] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.739 Starting test contend 00:06:29.739 Worker Delay Wait us Hold us Total us 00:06:29.739 0 3 166002 187628 353631 00:06:29.739 1 5 87674 287898 375572 00:06:29.739 PASS test contend 00:06:29.739 Starting test hold_by_poller 00:06:29.739 PASS test hold_by_poller 00:06:29.739 Starting test hold_by_message 00:06:29.739 PASS test hold_by_message 00:06:29.739 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:29.739 100014 assertions passed 00:06:29.739 0 assertions failed 00:06:29.739 00:06:29.739 real 0m0.651s 00:06:29.740 user 0m1.067s 00:06:29.740 sys 0m0.073s 00:06:29.740 02:54:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.740 02:54:24 -- common/autotest_common.sh@10 -- # set +x 00:06:29.740 ************************************ 00:06:29.740 END TEST thread_spdk_lock 00:06:29.740 ************************************ 00:06:29.740 00:06:29.740 real 0m3.232s 00:06:29.740 user 0m3.326s 00:06:29.740 sys 0m0.421s 00:06:29.740 02:54:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.740 02:54:24 -- common/autotest_common.sh@10 -- # set +x 00:06:29.740 ************************************ 00:06:29.740 END TEST thread 00:06:29.740 ************************************ 00:06:29.740 02:54:24 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:29.740 02:54:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
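In the spdk_lock contend table above, Total us is each worker's Wait us plus Hold us; the off-by-one on worker 0 is per-column microsecond rounding:

    echo $(( 166002 + 187628 ))   # -> 353630 (logged as 353631)
    echo $((  87674 + 287898 ))   # -> 375572 (matches)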
00:06:29.740 02:54:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:29.740 02:54:24 -- common/autotest_common.sh@10 -- # set +x 00:06:29.740 ************************************ 00:06:29.740 START TEST accel 00:06:29.740 ************************************ 00:06:29.740 02:54:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:29.998 * Looking for test storage... 00:06:29.998 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:29.998 02:54:25 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:29.998 02:54:25 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:29.998 02:54:25 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:29.998 02:54:25 -- accel/accel.sh@59 -- # spdk_tgt_pid=655332 00:06:29.999 02:54:25 -- accel/accel.sh@60 -- # waitforlisten 655332 00:06:29.999 02:54:25 -- common/autotest_common.sh@819 -- # '[' -z 655332 ']' 00:06:29.999 02:54:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.999 02:54:25 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:29.999 02:54:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:29.999 02:54:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.999 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:29.999 02:54:25 -- accel/accel.sh@58 -- # build_accel_config 00:06:29.999 02:54:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:29.999 02:54:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.999 02:54:25 -- common/autotest_common.sh@10 -- # set +x 00:06:29.999 02:54:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.999 02:54:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.999 02:54:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.999 02:54:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.999 02:54:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.999 02:54:25 -- accel/accel.sh@42 -- # jq -r . 00:06:29.999 [2024-07-14 02:54:25.073234] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:29.999 [2024-07-14 02:54:25.073324] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655332 ] 00:06:29.999 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.999 [2024-07-14 02:54:25.141732] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.999 [2024-07-14 02:54:25.177902] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:29.999 [2024-07-14 02:54:25.178013] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.935 02:54:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:30.935 02:54:25 -- common/autotest_common.sh@852 -- # return 0 00:06:30.935 02:54:25 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:30.935 02:54:25 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:30.935 02:54:25 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:30.935 02:54:25 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:30.935 02:54:25 -- common/autotest_common.sh@10 -- # set +x 00:06:30.935 02:54:25 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 
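The IFS== walk above (it continues for a few more opcodes below) is get_expected_opcs recording a module for every accel opcode the target reports. The jq expression the trace splits across lines is, reassembled, and assuming the stock scripts/rpc.py client:

    # Hypothetical standalone equivalent of the traced pipeline:
    ./scripts/rpc.py accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    # emits one "opcode=module" line per entry, here all "...=software"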
02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # IFS== 00:06:30.935 02:54:25 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.935 02:54:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.935 02:54:25 -- accel/accel.sh@67 -- # killprocess 655332 00:06:30.935 02:54:25 -- common/autotest_common.sh@926 -- # '[' -z 655332 ']' 00:06:30.935 02:54:25 -- common/autotest_common.sh@930 -- # kill -0 655332 00:06:30.935 02:54:25 -- common/autotest_common.sh@931 -- # uname 00:06:30.935 02:54:25 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:30.935 02:54:25 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 655332 00:06:30.935 02:54:25 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:30.935 02:54:25 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:30.935 02:54:25 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 655332' 00:06:30.935 killing process with pid 655332 00:06:30.935 02:54:25 -- common/autotest_common.sh@945 -- # kill 655332 00:06:30.935 02:54:25 -- common/autotest_common.sh@950 -- # wait 655332 00:06:31.194 02:54:26 -- accel/accel.sh@68 -- # trap - ERR 00:06:31.194 02:54:26 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:31.194 02:54:26 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:31.194 02:54:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.194 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.194 02:54:26 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:31.194 02:54:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:31.194 02:54:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.194 02:54:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.194 02:54:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.194 02:54:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.194 02:54:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.194 02:54:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.194 02:54:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.194 02:54:26 -- accel/accel.sh@42 -- # jq -r . 
00:06:31.194 02:54:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.194 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.194 02:54:26 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:31.194 02:54:26 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:31.194 02:54:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.194 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.194 ************************************ 00:06:31.194 START TEST accel_missing_filename 00:06:31.194 ************************************ 00:06:31.194 02:54:26 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:31.194 02:54:26 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.194 02:54:26 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:31.194 02:54:26 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:31.194 02:54:26 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.194 02:54:26 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:31.194 02:54:26 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.194 02:54:26 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:31.194 02:54:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:31.194 02:54:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.194 02:54:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.194 02:54:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.194 02:54:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.194 02:54:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.194 02:54:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.194 02:54:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.194 02:54:26 -- accel/accel.sh@42 -- # jq -r . 00:06:31.194 [2024-07-14 02:54:26.368262] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:31.194 [2024-07-14 02:54:26.368375] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655632 ] 00:06:31.194 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.194 [2024-07-14 02:54:26.438272] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.453 [2024-07-14 02:54:26.473574] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.453 [2024-07-14 02:54:26.512827] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.453 [2024-07-14 02:54:26.572598] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:31.453 A filename is required. 
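"A filename is required." is the failure accel_missing_filename is after: compress workloads read their input through -l, so accel_perf must refuse to start without it. A minimal re-run of just that case, assuming the build layout used in this job:

    # Expected to exit non-zero: compress with no -l <input file>
    ./build/examples/accel_perf -t 1 -w compress \
        && echo "unexpected success" \
        || echo "rejected as expected"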
00:06:31.453 02:54:26 -- common/autotest_common.sh@643 -- # es=234 00:06:31.453 02:54:26 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:31.453 02:54:26 -- common/autotest_common.sh@652 -- # es=106 00:06:31.453 02:54:26 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:31.453 02:54:26 -- common/autotest_common.sh@660 -- # es=1 00:06:31.453 02:54:26 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:31.453 00:06:31.453 real 0m0.283s 00:06:31.453 user 0m0.192s 00:06:31.453 sys 0m0.129s 00:06:31.453 02:54:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.453 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.453 ************************************ 00:06:31.453 END TEST accel_missing_filename 00:06:31.453 ************************************ 00:06:31.453 02:54:26 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.453 02:54:26 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:31.453 02:54:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.453 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.453 ************************************ 00:06:31.453 START TEST accel_compress_verify 00:06:31.453 ************************************ 00:06:31.453 02:54:26 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.453 02:54:26 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.453 02:54:26 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.453 02:54:26 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:31.453 02:54:26 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.454 02:54:26 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:31.454 02:54:26 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.454 02:54:26 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.454 02:54:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.454 02:54:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.454 02:54:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.454 02:54:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.454 02:54:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.454 02:54:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.454 02:54:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.454 02:54:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.454 02:54:26 -- accel/accel.sh@42 -- # jq -r . 00:06:31.454 [2024-07-14 02:54:26.686511] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:31.454 [2024-07-14 02:54:26.686600] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655652 ] 00:06:31.713 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.713 [2024-07-14 02:54:26.756446] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.713 [2024-07-14 02:54:26.791216] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.713 [2024-07-14 02:54:26.830103] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.713 [2024-07-14 02:54:26.889029] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:31.713 00:06:31.713 Compression does not support the verify option, aborting. 00:06:31.713 02:54:26 -- common/autotest_common.sh@643 -- # es=161 00:06:31.713 02:54:26 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:31.713 02:54:26 -- common/autotest_common.sh@652 -- # es=33 00:06:31.713 02:54:26 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:31.713 02:54:26 -- common/autotest_common.sh@660 -- # es=1 00:06:31.713 02:54:26 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:31.713 00:06:31.713 real 0m0.282s 00:06:31.713 user 0m0.190s 00:06:31.713 sys 0m0.129s 00:06:31.713 02:54:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.713 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.713 ************************************ 00:06:31.713 END TEST accel_compress_verify 00:06:31.713 ************************************ 00:06:31.972 02:54:26 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:31.972 02:54:26 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:31.972 02:54:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.972 02:54:26 -- common/autotest_common.sh@10 -- # set +x 00:06:31.972 ************************************ 00:06:31.972 START TEST accel_wrong_workload 00:06:31.972 ************************************ 00:06:31.972 02:54:26 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:31.972 02:54:26 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.972 02:54:26 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:31.972 02:54:26 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:31.972 02:54:26 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.972 02:54:26 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:31.972 02:54:26 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.972 02:54:26 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:31.972 02:54:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:31.972 02:54:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.972 02:54:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.973 02:54:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.973 02:54:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.973 02:54:26 -- accel/accel.sh@42 -- # jq -r . 
00:06:31.973 Unsupported workload type: foobar 00:06:31.973 [2024-07-14 02:54:27.007323] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:31.973 accel_perf options: 00:06:31.973 [-h help message] 00:06:31.973 [-q queue depth per core] 00:06:31.973 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:31.973 [-T number of threads per core 00:06:31.973 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:31.973 [-t time in seconds] 00:06:31.973 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:31.973 [ dif_verify, , dif_generate, dif_generate_copy 00:06:31.973 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:31.973 [-l for compress/decompress workloads, name of uncompressed input file 00:06:31.973 [-S for crc32c workload, use this seed value (default 0) 00:06:31.973 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:31.973 [-f for fill workload, use this BYTE value (default 255) 00:06:31.973 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:31.973 [-y verify result if this switch is on] 00:06:31.973 [-a tasks to allocate per core (default: same value as -q)] 00:06:31.973 Can be used to spread operations across a wider range of memory. 00:06:31.973 02:54:27 -- common/autotest_common.sh@643 -- # es=1 00:06:31.973 02:54:27 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:31.973 02:54:27 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:31.973 02:54:27 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:31.973 00:06:31.973 real 0m0.025s 00:06:31.973 user 0m0.009s 00:06:31.973 sys 0m0.016s 00:06:31.973 02:54:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.973 02:54:27 -- common/autotest_common.sh@10 -- # set +x 00:06:31.973 ************************************ 00:06:31.973 END TEST accel_wrong_workload 00:06:31.973 ************************************ 00:06:31.973 Error: writing output failed: Broken pipe 00:06:31.973 02:54:27 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:31.973 02:54:27 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:31.973 02:54:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.973 02:54:27 -- common/autotest_common.sh@10 -- # set +x 00:06:31.973 ************************************ 00:06:31.973 START TEST accel_negative_buffers 00:06:31.973 ************************************ 00:06:31.973 02:54:27 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:31.973 02:54:27 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.973 02:54:27 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:31.973 02:54:27 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:31.973 02:54:27 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.973 02:54:27 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:31.973 02:54:27 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.973 02:54:27 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:31.973 02:54:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:31.973 02:54:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.973 02:54:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.973 02:54:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.973 02:54:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.973 02:54:27 -- accel/accel.sh@42 -- # jq -r . 00:06:31.973 -x option must be non-negative. 00:06:31.973 [2024-07-14 02:54:27.072271] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:31.973 accel_perf options: 00:06:31.973 [-h help message] 00:06:31.973 [-q queue depth per core] 00:06:31.973 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:31.973 [-T number of threads per core 00:06:31.973 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:31.973 [-t time in seconds] 00:06:31.973 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:31.973 [ dif_verify, , dif_generate, dif_generate_copy 00:06:31.973 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:31.973 [-l for compress/decompress workloads, name of uncompressed input file 00:06:31.973 [-S for crc32c workload, use this seed value (default 0) 00:06:31.973 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:31.973 [-f for fill workload, use this BYTE value (default 255) 00:06:31.973 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:31.973 [-y verify result if this switch is on] 00:06:31.973 [-a tasks to allocate per core (default: same value as -q)] 00:06:31.973 Can be used to spread operations across a wider range of memory. 
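The usage dump above follows "-x option must be non-negative.": xor needs at least two source buffers, so -x -1 dies in option parsing, just as -w foobar did a moment earlier. Both negative cases, assuming the same binary path as in the trace:

    ./build/examples/accel_perf -t 1 -w foobar        # "Unsupported workload type: foobar"
    ./build/examples/accel_perf -t 1 -w xor -y -x -1  # "-x option must be non-negative."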
00:06:31.973 02:54:27 -- common/autotest_common.sh@643 -- # es=1 00:06:31.973 02:54:27 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:31.973 02:54:27 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:31.973 02:54:27 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:31.973 00:06:31.973 real 0m0.023s 00:06:31.973 user 0m0.011s 00:06:31.973 sys 0m0.012s 00:06:31.973 02:54:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.973 02:54:27 -- common/autotest_common.sh@10 -- # set +x 00:06:31.973 ************************************ 00:06:31.973 END TEST accel_negative_buffers 00:06:31.973 ************************************ 00:06:31.973 Error: writing output failed: Broken pipe 00:06:31.973 02:54:27 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:31.973 02:54:27 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:31.973 02:54:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.973 02:54:27 -- common/autotest_common.sh@10 -- # set +x 00:06:31.973 ************************************ 00:06:31.973 START TEST accel_crc32c 00:06:31.973 ************************************ 00:06:31.973 02:54:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:31.973 02:54:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:31.973 02:54:27 -- accel/accel.sh@17 -- # local accel_module 00:06:31.973 02:54:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:31.973 02:54:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.973 02:54:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:31.973 02:54:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.973 02:54:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.973 02:54:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.973 02:54:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.973 02:54:27 -- accel/accel.sh@42 -- # jq -r . 00:06:31.973 [2024-07-14 02:54:27.128116] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:31.973 [2024-07-14 02:54:27.128213] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655847 ] 00:06:31.973 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.973 [2024-07-14 02:54:27.197182] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.232 [2024-07-14 02:54:27.233923] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.168 02:54:28 -- accel/accel.sh@18 -- # out=' 00:06:33.168 SPDK Configuration: 00:06:33.168 Core mask: 0x1 00:06:33.168 00:06:33.168 Accel Perf Configuration: 00:06:33.168 Workload Type: crc32c 00:06:33.168 CRC-32C seed: 32 00:06:33.168 Transfer size: 4096 bytes 00:06:33.168 Vector count 1 00:06:33.168 Module: software 00:06:33.168 Queue depth: 32 00:06:33.168 Allocate depth: 32 00:06:33.168 # threads/core: 1 00:06:33.168 Run time: 1 seconds 00:06:33.168 Verify: Yes 00:06:33.168 00:06:33.168 Running for 1 seconds... 
00:06:33.168 00:06:33.168 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:33.168 ------------------------------------------------------------------------------------ 00:06:33.168 0,0 858240/s 3352 MiB/s 0 0 00:06:33.168 ==================================================================================== 00:06:33.168 Total 858240/s 3352 MiB/s 0 0' 00:06:33.168 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.168 02:54:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:33.168 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.168 02:54:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:33.168 02:54:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.168 02:54:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.168 02:54:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.168 02:54:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.168 02:54:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.168 02:54:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.168 02:54:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.168 02:54:28 -- accel/accel.sh@42 -- # jq -r . 00:06:33.168 [2024-07-14 02:54:28.403065] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:33.168 [2024-07-14 02:54:28.403120] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655995 ] 00:06:33.428 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.428 [2024-07-14 02:54:28.466265] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.428 [2024-07-14 02:54:28.500911] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=0x1 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=crc32c 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=32 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 
02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=software 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@23 -- # accel_module=software 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=32 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=32 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=1 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val=Yes 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:33.428 02:54:28 -- accel/accel.sh@21 -- # val= 00:06:33.428 02:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # IFS=: 00:06:33.428 02:54:28 -- accel/accel.sh@20 -- # read -r var val 00:06:34.808 02:54:29 -- accel/accel.sh@21 -- # val= 00:06:34.808 02:54:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # IFS=: 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # read -r var val 00:06:34.808 02:54:29 -- accel/accel.sh@21 -- # val= 00:06:34.808 02:54:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # IFS=: 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # read -r var val 00:06:34.808 02:54:29 -- accel/accel.sh@21 -- # val= 00:06:34.808 02:54:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # IFS=: 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # read -r var val 00:06:34.808 02:54:29 -- accel/accel.sh@21 -- # val= 00:06:34.808 02:54:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # IFS=: 00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # read -r var val 00:06:34.808 02:54:29 -- accel/accel.sh@21 -- # val= 00:06:34.808 02:54:29 -- accel/accel.sh@22 -- # case "$var" in 
00:06:34.808 02:54:29 -- accel/accel.sh@20 -- # IFS=: 00:06:34.809 02:54:29 -- accel/accel.sh@20 -- # read -r var val 00:06:34.809 02:54:29 -- accel/accel.sh@21 -- # val= 00:06:34.809 02:54:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.809 02:54:29 -- accel/accel.sh@20 -- # IFS=: 00:06:34.809 02:54:29 -- accel/accel.sh@20 -- # read -r var val 00:06:34.809 02:54:29 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:34.809 02:54:29 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:34.809 02:54:29 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:34.809 00:06:34.809 real 0m2.555s 00:06:34.809 user 0m2.306s 00:06:34.809 sys 0m0.248s 00:06:34.809 02:54:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.809 02:54:29 -- common/autotest_common.sh@10 -- # set +x 00:06:34.809 ************************************ 00:06:34.809 END TEST accel_crc32c 00:06:34.809 ************************************ 00:06:34.809 02:54:29 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:34.809 02:54:29 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:34.809 02:54:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:34.809 02:54:29 -- common/autotest_common.sh@10 -- # set +x 00:06:34.809 ************************************ 00:06:34.809 START TEST accel_crc32c_C2 00:06:34.809 ************************************ 00:06:34.809 02:54:29 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:34.809 02:54:29 -- accel/accel.sh@16 -- # local accel_opc 00:06:34.809 02:54:29 -- accel/accel.sh@17 -- # local accel_module 00:06:34.809 02:54:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:34.809 02:54:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:34.809 02:54:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.809 02:54:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.809 02:54:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.809 02:54:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.809 02:54:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.809 02:54:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.809 02:54:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.809 02:54:29 -- accel/accel.sh@42 -- # jq -r . 00:06:34.809 [2024-07-14 02:54:29.723437] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:34.809 [2024-07-14 02:54:29.723539] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid656269 ] 00:06:34.809 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.809 [2024-07-14 02:54:29.793180] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.809 [2024-07-14 02:54:29.828333] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.877 02:54:30 -- accel/accel.sh@18 -- # out=' 00:06:35.877 SPDK Configuration: 00:06:35.877 Core mask: 0x1 00:06:35.877 00:06:35.877 Accel Perf Configuration: 00:06:35.877 Workload Type: crc32c 00:06:35.877 CRC-32C seed: 0 00:06:35.877 Transfer size: 4096 bytes 00:06:35.877 Vector count 2 00:06:35.877 Module: software 00:06:35.877 Queue depth: 32 00:06:35.877 Allocate depth: 32 00:06:35.877 # threads/core: 1 00:06:35.877 Run time: 1 seconds 00:06:35.877 Verify: Yes 00:06:35.877 00:06:35.877 Running for 1 seconds... 00:06:35.877 00:06:35.877 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:35.877 ------------------------------------------------------------------------------------ 00:06:35.877 0,0 610592/s 4770 MiB/s 0 0 00:06:35.877 ==================================================================================== 00:06:35.877 Total 610592/s 2385 MiB/s 0 0' 00:06:35.877 02:54:30 -- accel/accel.sh@20 -- # IFS=: 00:06:35.877 02:54:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:35.877 02:54:30 -- accel/accel.sh@20 -- # read -r var val 00:06:35.877 02:54:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:35.877 02:54:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.877 02:54:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.877 02:54:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.877 02:54:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.877 02:54:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.877 02:54:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.877 02:54:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.877 02:54:30 -- accel/accel.sh@42 -- # jq -r . 00:06:35.877 [2024-07-14 02:54:30.997172] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:35.877 [2024-07-14 02:54:30.997225] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid656538 ] 00:06:35.877 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.877 [2024-07-14 02:54:31.060210] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.877 [2024-07-14 02:54:31.094364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=0x1 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=crc32c 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=0 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=software 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=32 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=32 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- 
accel/accel.sh@21 -- # val=1 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val=Yes 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:36.137 02:54:31 -- accel/accel.sh@21 -- # val= 00:06:36.137 02:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # IFS=: 00:06:36.137 02:54:31 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@21 -- # val= 00:06:37.074 02:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # IFS=: 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@21 -- # val= 00:06:37.074 02:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # IFS=: 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@21 -- # val= 00:06:37.074 02:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # IFS=: 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@21 -- # val= 00:06:37.074 02:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # IFS=: 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@21 -- # val= 00:06:37.074 02:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # IFS=: 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@21 -- # val= 00:06:37.074 02:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # IFS=: 00:06:37.074 02:54:32 -- accel/accel.sh@20 -- # read -r var val 00:06:37.074 02:54:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:37.074 02:54:32 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:37.074 02:54:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.074 00:06:37.074 real 0m2.555s 00:06:37.074 user 0m2.310s 00:06:37.074 sys 0m0.244s 00:06:37.074 02:54:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.074 02:54:32 -- common/autotest_common.sh@10 -- # set +x 00:06:37.074 ************************************ 00:06:37.074 END TEST accel_crc32c_C2 00:06:37.074 ************************************ 00:06:37.074 02:54:32 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:37.074 02:54:32 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:37.074 02:54:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.074 02:54:32 -- common/autotest_common.sh@10 -- # set +x 00:06:37.074 ************************************ 00:06:37.074 START TEST accel_copy 
00:06:37.074 ************************************ 00:06:37.074 02:54:32 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:37.074 02:54:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.074 02:54:32 -- accel/accel.sh@17 -- # local accel_module 00:06:37.074 02:54:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:37.074 02:54:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:37.074 02:54:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.074 02:54:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.074 02:54:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.074 02:54:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.074 02:54:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.074 02:54:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.074 02:54:32 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.074 02:54:32 -- accel/accel.sh@42 -- # jq -r . 00:06:37.074 [2024-07-14 02:54:32.314355] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:37.074 [2024-07-14 02:54:32.314542] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid656827 ] 00:06:37.333 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.333 [2024-07-14 02:54:32.383261] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.333 [2024-07-14 02:54:32.418394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.712 02:54:33 -- accel/accel.sh@18 -- # out=' 00:06:38.712 SPDK Configuration: 00:06:38.712 Core mask: 0x1 00:06:38.712 00:06:38.712 Accel Perf Configuration: 00:06:38.712 Workload Type: copy 00:06:38.712 Transfer size: 4096 bytes 00:06:38.712 Vector count 1 00:06:38.712 Module: software 00:06:38.712 Queue depth: 32 00:06:38.712 Allocate depth: 32 00:06:38.712 # threads/core: 1 00:06:38.712 Run time: 1 seconds 00:06:38.712 Verify: Yes 00:06:38.712 00:06:38.712 Running for 1 seconds... 00:06:38.712 00:06:38.712 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:38.712 ------------------------------------------------------------------------------------ 00:06:38.712 0,0 539584/s 2107 MiB/s 0 0 00:06:38.712 ==================================================================================== 00:06:38.712 Total 539584/s 2107 MiB/s 0 0' 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:38.712 02:54:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.712 02:54:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.712 02:54:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.712 02:54:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.712 02:54:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.712 02:54:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.712 02:54:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.712 02:54:33 -- accel/accel.sh@42 -- # jq -r . 00:06:38.712 [2024-07-14 02:54:33.587757] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
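The copy results just above (Total 539584/s 2107 MiB/s) come from the first of two accel_perf invocations; the second pass, whose startup banner appears just above, re-runs the same workload while the script traces its expected opcode/module values. The harness also feeds a JSON accel config over a pipe via -c /dev/fd/62; a hand-run can omit that and take the defaults. A minimal sketch of the same run, with the reported bandwidth re-derived from the transfer rate (the binary path and sudo usage assume the workspace layout and hugepage setup seen in this log):

  # copy workload, 1 second, verify on -- mirrors 'accel_test -t 1 -w copy -y'
  SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
  sudo "$SPDK_BIN" -t 1 -w copy -y

  # MiB/s column = transfers/s * transfer size / 2^20, integer-truncated:
  echo $(( 539584 * 4096 / 1048576 ))   # -> 2107, matching '539584/s 2107 MiB/s'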
00:06:38.712 [2024-07-14 02:54:33.587810] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657093 ] 00:06:38.712 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.712 [2024-07-14 02:54:33.650057] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.712 [2024-07-14 02:54:33.684044] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=0x1 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=copy 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=software 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@23 -- # accel_module=software 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=32 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=32 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=1 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val=Yes 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:38.712 02:54:33 -- accel/accel.sh@21 -- # val= 00:06:38.712 02:54:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # IFS=: 00:06:38.712 02:54:33 -- accel/accel.sh@20 -- # read -r var val 00:06:39.652 02:54:34 -- accel/accel.sh@21 -- # val= 00:06:39.652 02:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # IFS=: 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # read -r var val 00:06:39.653 02:54:34 -- accel/accel.sh@21 -- # val= 00:06:39.653 02:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # IFS=: 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # read -r var val 00:06:39.653 02:54:34 -- accel/accel.sh@21 -- # val= 00:06:39.653 02:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # IFS=: 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # read -r var val 00:06:39.653 02:54:34 -- accel/accel.sh@21 -- # val= 00:06:39.653 02:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # IFS=: 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # read -r var val 00:06:39.653 02:54:34 -- accel/accel.sh@21 -- # val= 00:06:39.653 02:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # IFS=: 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # read -r var val 00:06:39.653 02:54:34 -- accel/accel.sh@21 -- # val= 00:06:39.653 02:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # IFS=: 00:06:39.653 02:54:34 -- accel/accel.sh@20 -- # read -r var val 00:06:39.653 02:54:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:39.653 02:54:34 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:39.653 02:54:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:39.653 00:06:39.653 real 0m2.552s 00:06:39.653 user 0m2.314s 00:06:39.653 sys 0m0.237s 00:06:39.653 02:54:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:39.653 02:54:34 -- common/autotest_common.sh@10 -- # set +x 00:06:39.653 ************************************ 00:06:39.653 END TEST accel_copy 00:06:39.653 ************************************ 00:06:39.653 02:54:34 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:39.653 02:54:34 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:39.653 02:54:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:39.653 02:54:34 -- common/autotest_common.sh@10 -- # set +x 00:06:39.653 ************************************ 00:06:39.653 START TEST accel_fill 00:06:39.653 ************************************ 00:06:39.653 02:54:34 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:39.653 02:54:34 -- accel/accel.sh@16 -- # local accel_opc 
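The fill test just launched adds three flags to the usual pattern: -f 128 sets the fill byte (128 decimal = 0x80, which is how it shows up as 'Fill pattern: 0x80' in the summary below), and -q 64 / -a 64 double the queue and allocate depths from the 32 used by the copy tests. A quick check of the decimal-to-hex mapping, plus a hedged hand-run equivalent (binary path taken from this log):

  printf '0x%x\n' 128   # -> 0x80, the pattern accel_perf reports
  SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
  sudo "$SPDK_BIN" -t 1 -w fill -f 128 -q 64 -a 64 -y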
00:06:39.653 02:54:34 -- accel/accel.sh@17 -- # local accel_module 00:06:39.653 02:54:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:39.653 02:54:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:39.653 02:54:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.653 02:54:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.653 02:54:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.653 02:54:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.653 02:54:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.653 02:54:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.653 02:54:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.653 02:54:34 -- accel/accel.sh@42 -- # jq -r . 00:06:39.653 [2024-07-14 02:54:34.902948] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:39.653 [2024-07-14 02:54:34.903043] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657287 ] 00:06:39.913 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.913 [2024-07-14 02:54:34.971318] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.913 [2024-07-14 02:54:35.006571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.291 02:54:36 -- accel/accel.sh@18 -- # out=' 00:06:41.291 SPDK Configuration: 00:06:41.291 Core mask: 0x1 00:06:41.291 00:06:41.291 Accel Perf Configuration: 00:06:41.291 Workload Type: fill 00:06:41.291 Fill pattern: 0x80 00:06:41.291 Transfer size: 4096 bytes 00:06:41.291 Vector count 1 00:06:41.291 Module: software 00:06:41.291 Queue depth: 64 00:06:41.291 Allocate depth: 64 00:06:41.291 # threads/core: 1 00:06:41.291 Run time: 1 seconds 00:06:41.291 Verify: Yes 00:06:41.291 00:06:41.291 Running for 1 seconds... 00:06:41.291 00:06:41.291 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.291 ------------------------------------------------------------------------------------ 00:06:41.291 0,0 954944/s 3730 MiB/s 0 0 00:06:41.291 ==================================================================================== 00:06:41.291 Total 954944/s 3730 MiB/s 0 0' 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:41.291 02:54:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:41.291 02:54:36 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.291 02:54:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.291 02:54:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.291 02:54:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.291 02:54:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.291 02:54:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.291 02:54:36 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.291 02:54:36 -- accel/accel.sh@42 -- # jq -r . 00:06:41.291 [2024-07-14 02:54:36.176641] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
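Because accel_perf prints its summary in a fixed layout, results like the fill numbers above (Total 954944/s 3730 MiB/s) are easy to scrape once the tool's own stdout is captured. A small sketch, assuming the run's output was saved to a hypothetical fill.out; note that in this Jenkins log every field is shifted right by the leading timestamp, so scrape the raw output rather than the log:

  # Pull the aggregate rate ('954944/s') out of the Total row.
  awk '$1 == "Total" {print $2}' fill.out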
00:06:41.291 [2024-07-14 02:54:36.176693] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657426 ] 00:06:41.291 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.291 [2024-07-14 02:54:36.238928] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.291 [2024-07-14 02:54:36.273038] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=0x1 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=fill 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=0x80 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=software 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=64 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=64 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- 
accel/accel.sh@21 -- # val=1 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val=Yes 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:41.291 02:54:36 -- accel/accel.sh@21 -- # val= 00:06:41.291 02:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # IFS=: 00:06:41.291 02:54:36 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@21 -- # val= 00:06:42.229 02:54:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # IFS=: 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@21 -- # val= 00:06:42.229 02:54:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # IFS=: 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@21 -- # val= 00:06:42.229 02:54:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # IFS=: 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@21 -- # val= 00:06:42.229 02:54:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # IFS=: 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@21 -- # val= 00:06:42.229 02:54:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # IFS=: 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@21 -- # val= 00:06:42.229 02:54:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # IFS=: 00:06:42.229 02:54:37 -- accel/accel.sh@20 -- # read -r var val 00:06:42.229 02:54:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:42.229 02:54:37 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:42.229 02:54:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.229 00:06:42.229 real 0m2.551s 00:06:42.229 user 0m2.305s 00:06:42.229 sys 0m0.246s 00:06:42.229 02:54:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.229 02:54:37 -- common/autotest_common.sh@10 -- # set +x 00:06:42.229 ************************************ 00:06:42.229 END TEST accel_fill 00:06:42.229 ************************************ 00:06:42.229 02:54:37 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:42.229 02:54:37 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:42.229 02:54:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:42.229 02:54:37 -- common/autotest_common.sh@10 -- # set +x 00:06:42.229 ************************************ 00:06:42.229 START TEST 
accel_copy_crc32c 00:06:42.229 ************************************ 00:06:42.229 02:54:37 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:42.229 02:54:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:42.229 02:54:37 -- accel/accel.sh@17 -- # local accel_module 00:06:42.229 02:54:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:42.488 02:54:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:42.488 02:54:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.489 02:54:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.489 02:54:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.489 02:54:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.489 02:54:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.489 02:54:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.489 02:54:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.489 02:54:37 -- accel/accel.sh@42 -- # jq -r . 00:06:42.489 [2024-07-14 02:54:37.497427] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:42.489 [2024-07-14 02:54:37.497649] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657689 ] 00:06:42.489 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.489 [2024-07-14 02:54:37.566203] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.489 [2024-07-14 02:54:37.599851] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.865 02:54:38 -- accel/accel.sh@18 -- # out=' 00:06:43.865 SPDK Configuration: 00:06:43.865 Core mask: 0x1 00:06:43.865 00:06:43.865 Accel Perf Configuration: 00:06:43.865 Workload Type: copy_crc32c 00:06:43.865 CRC-32C seed: 0 00:06:43.865 Vector size: 4096 bytes 00:06:43.865 Transfer size: 4096 bytes 00:06:43.865 Vector count 1 00:06:43.865 Module: software 00:06:43.865 Queue depth: 32 00:06:43.865 Allocate depth: 32 00:06:43.865 # threads/core: 1 00:06:43.865 Run time: 1 seconds 00:06:43.865 Verify: Yes 00:06:43.865 00:06:43.865 Running for 1 seconds... 00:06:43.865 00:06:43.865 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:43.865 ------------------------------------------------------------------------------------ 00:06:43.865 0,0 430080/s 1680 MiB/s 0 0 00:06:43.865 ==================================================================================== 00:06:43.865 Total 430080/s 1680 MiB/s 0 0' 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:43.865 02:54:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.865 02:54:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.865 02:54:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.865 02:54:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.865 02:54:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.865 02:54:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.865 02:54:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.865 02:54:38 -- accel/accel.sh@42 -- # jq -r . 
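copy_crc32c chains two operations per 4096-byte transfer -- the copy itself plus a CRC-32C over the data, seeded with 0 as the 'CRC-32C seed: 0' line above shows -- which is presumably why its rate (430080/s) sits well below plain copy (539584/s) on the same software path. A hand-run sketch (path taken from this log):

  # Copy + CRC-32C in one workload; the seed defaults to the 0 reported above.
  SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
  sudo "$SPDK_BIN" -t 1 -w copy_crc32c -y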
00:06:43.865 [2024-07-14 02:54:38.768776] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:43.865 [2024-07-14 02:54:38.768830] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657955 ] 00:06:43.865 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.865 [2024-07-14 02:54:38.833059] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.865 [2024-07-14 02:54:38.867096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val=0x1 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val=0 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val=software 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.865 02:54:38 -- accel/accel.sh@23 -- # accel_module=software 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val=32 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 
00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.865 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.865 02:54:38 -- accel/accel.sh@21 -- # val=32 00:06:43.865 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.866 02:54:38 -- accel/accel.sh@21 -- # val=1 00:06:43.866 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.866 02:54:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:43.866 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.866 02:54:38 -- accel/accel.sh@21 -- # val=Yes 00:06:43.866 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.866 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.866 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:43.866 02:54:38 -- accel/accel.sh@21 -- # val= 00:06:43.866 02:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # IFS=: 00:06:43.866 02:54:38 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@21 -- # val= 00:06:44.805 02:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # IFS=: 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@21 -- # val= 00:06:44.805 02:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # IFS=: 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@21 -- # val= 00:06:44.805 02:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # IFS=: 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@21 -- # val= 00:06:44.805 02:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # IFS=: 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@21 -- # val= 00:06:44.805 02:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # IFS=: 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@21 -- # val= 00:06:44.805 02:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # IFS=: 00:06:44.805 02:54:40 -- accel/accel.sh@20 -- # read -r var val 00:06:44.805 02:54:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:44.805 02:54:40 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:44.805 02:54:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:44.805 00:06:44.805 real 0m2.554s 00:06:44.805 user 0m2.321s 00:06:44.805 sys 0m0.232s 00:06:44.805 02:54:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.805 02:54:40 -- common/autotest_common.sh@10 -- # set +x 00:06:44.805 ************************************ 00:06:44.805 END TEST accel_copy_crc32c 00:06:44.805 ************************************ 00:06:45.064 
02:54:40 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:45.064 02:54:40 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:45.064 02:54:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.064 02:54:40 -- common/autotest_common.sh@10 -- # set +x 00:06:45.064 ************************************ 00:06:45.064 START TEST accel_copy_crc32c_C2 00:06:45.064 ************************************ 00:06:45.064 02:54:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:45.064 02:54:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.064 02:54:40 -- accel/accel.sh@17 -- # local accel_module 00:06:45.064 02:54:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:45.064 02:54:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:45.064 02:54:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.064 02:54:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.064 02:54:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.064 02:54:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.064 02:54:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.064 02:54:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.064 02:54:40 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.064 02:54:40 -- accel/accel.sh@42 -- # jq -r . 00:06:45.064 [2024-07-14 02:54:40.095177] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:45.064 [2024-07-14 02:54:40.095288] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658236 ] 00:06:45.064 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.064 [2024-07-14 02:54:40.165051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.064 [2024-07-14 02:54:40.201200] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.444 02:54:41 -- accel/accel.sh@18 -- # out=' 00:06:46.444 SPDK Configuration: 00:06:46.444 Core mask: 0x1 00:06:46.444 00:06:46.444 Accel Perf Configuration: 00:06:46.444 Workload Type: copy_crc32c 00:06:46.444 CRC-32C seed: 0 00:06:46.444 Vector size: 4096 bytes 00:06:46.444 Transfer size: 8192 bytes 00:06:46.444 Vector count 2 00:06:46.444 Module: software 00:06:46.444 Queue depth: 32 00:06:46.444 Allocate depth: 32 00:06:46.444 # threads/core: 1 00:06:46.444 Run time: 1 seconds 00:06:46.444 Verify: Yes 00:06:46.444 00:06:46.444 Running for 1 seconds... 
00:06:46.444 00:06:46.444 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:46.444 ------------------------------------------------------------------------------------ 00:06:46.444 0,0 306560/s 2395 MiB/s 0 0 00:06:46.444 ==================================================================================== 00:06:46.444 Total 306560/s 1197 MiB/s 0 0' 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:46.444 02:54:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.444 02:54:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.444 02:54:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.444 02:54:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.444 02:54:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.444 02:54:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.444 02:54:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.444 02:54:41 -- accel/accel.sh@42 -- # jq -r . 00:06:46.444 [2024-07-14 02:54:41.370951] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:46.444 [2024-07-14 02:54:41.371003] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658510 ] 00:06:46.444 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.444 [2024-07-14 02:54:41.433069] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.444 [2024-07-14 02:54:41.466869] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=0x1 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=0 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 
00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=software 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@23 -- # accel_module=software 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=32 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=32 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=1 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val=Yes 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:46.444 02:54:41 -- accel/accel.sh@21 -- # val= 00:06:46.444 02:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # IFS=: 00:06:46.444 02:54:41 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@21 -- # val= 00:06:47.381 02:54:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # IFS=: 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@21 -- # val= 00:06:47.381 02:54:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # IFS=: 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@21 -- # val= 00:06:47.381 02:54:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # IFS=: 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@21 -- # val= 00:06:47.381 02:54:42 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # IFS=: 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@21 -- # val= 00:06:47.381 02:54:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # IFS=: 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@21 -- # val= 00:06:47.381 02:54:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # IFS=: 00:06:47.381 02:54:42 -- accel/accel.sh@20 -- # read -r var val 00:06:47.381 02:54:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:47.381 02:54:42 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:47.381 02:54:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.381 00:06:47.381 real 0m2.556s 00:06:47.381 user 0m2.311s 00:06:47.381 sys 0m0.244s 00:06:47.381 02:54:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.381 02:54:42 -- common/autotest_common.sh@10 -- # set +x 00:06:47.381 ************************************ 00:06:47.381 END TEST accel_copy_crc32c_C2 00:06:47.381 ************************************ 00:06:47.640 02:54:42 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:47.640 02:54:42 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:47.640 02:54:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:47.640 02:54:42 -- common/autotest_common.sh@10 -- # set +x 00:06:47.640 ************************************ 00:06:47.640 START TEST accel_dualcast 00:06:47.640 ************************************ 00:06:47.640 02:54:42 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:47.640 02:54:42 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.640 02:54:42 -- accel/accel.sh@17 -- # local accel_module 00:06:47.640 02:54:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:47.640 02:54:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:47.640 02:54:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.640 02:54:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.640 02:54:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.640 02:54:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.640 02:54:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.640 02:54:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.640 02:54:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.640 02:54:42 -- accel/accel.sh@42 -- # jq -r . 00:06:47.641 [2024-07-14 02:54:42.686659] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
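Before the dualcast run gets going, one oddity in the copy_crc32c_C2 summary above is worth pinning down: with -C 2, each transfer is two 4096-byte vectors (8192 bytes), and both summary rows report the same 306560 transfers/s, yet the per-core row says 2395 MiB/s while the Total row says 1197 MiB/s on a single core. Re-deriving both suggests the per-core figure counts the full 8192-byte transfer while the Total figure counts only one 4096-byte vector, so the halved Total looks like an accounting quirk of this accel_perf build rather than a real slowdown:

  echo $(( 306560 * 8192 / 1048576 ))   # -> 2395  (full 8192-byte transfer)
  echo $(( 306560 * 4096 / 1048576 ))   # -> 1197  (single 4096-byte vector)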
00:06:47.641 [2024-07-14 02:54:42.686748] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658744 ] 00:06:47.641 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.641 [2024-07-14 02:54:42.755925] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.641 [2024-07-14 02:54:42.790790] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.016 02:54:43 -- accel/accel.sh@18 -- # out=' 00:06:49.016 SPDK Configuration: 00:06:49.016 Core mask: 0x1 00:06:49.016 00:06:49.016 Accel Perf Configuration: 00:06:49.016 Workload Type: dualcast 00:06:49.016 Transfer size: 4096 bytes 00:06:49.016 Vector count 1 00:06:49.016 Module: software 00:06:49.016 Queue depth: 32 00:06:49.016 Allocate depth: 32 00:06:49.016 # threads/core: 1 00:06:49.016 Run time: 1 seconds 00:06:49.016 Verify: Yes 00:06:49.016 00:06:49.016 Running for 1 seconds... 00:06:49.016 00:06:49.016 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.016 ------------------------------------------------------------------------------------ 00:06:49.016 0,0 679584/s 2654 MiB/s 0 0 00:06:49.016 ==================================================================================== 00:06:49.016 Total 679584/s 2654 MiB/s 0 0' 00:06:49.016 02:54:43 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:43 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:49.016 02:54:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:49.016 02:54:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.016 02:54:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.016 02:54:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.016 02:54:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.016 02:54:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.016 02:54:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.016 02:54:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.016 02:54:43 -- accel/accel.sh@42 -- # jq -r . 00:06:49.016 [2024-07-14 02:54:43.959904] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
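dualcast duplicates one 4096-byte source buffer into two destinations per operation; the 679584 transfers/s above works out to roughly 2654 MiB/s of source-side throughput. As with every test in this file, the workload runs twice: the first pass produces the summary, and the second pass, whose startup banner appears just above, is traced so the script can assert which opcode and module actually ran -- the [[ -n software ]] / [[ -n dualcast ]] / [[ software == software ]] checks visible at the end of each test. A paraphrase of that final check, with the values this run captured:

  accel_module=software; accel_opc=dualcast   # as captured by the traced run
  [[ -n $accel_module && -n $accel_opc && $accel_module == software ]] \
    && echo "dualcast ran on the software module"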
00:06:49.016 [2024-07-14 02:54:43.959961] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658885 ] 00:06:49.016 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.016 [2024-07-14 02:54:44.022333] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.016 [2024-07-14 02:54:44.056614] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=0x1 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=dualcast 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=software 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=32 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=32 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=1 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val=Yes 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:49.016 02:54:44 -- accel/accel.sh@21 -- # val= 00:06:49.016 02:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # IFS=: 00:06:49.016 02:54:44 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@21 -- # val= 00:06:50.394 02:54:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # IFS=: 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@21 -- # val= 00:06:50.394 02:54:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # IFS=: 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@21 -- # val= 00:06:50.394 02:54:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # IFS=: 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@21 -- # val= 00:06:50.394 02:54:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # IFS=: 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@21 -- # val= 00:06:50.394 02:54:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # IFS=: 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@21 -- # val= 00:06:50.394 02:54:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # IFS=: 00:06:50.394 02:54:45 -- accel/accel.sh@20 -- # read -r var val 00:06:50.394 02:54:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:50.394 02:54:45 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:50.394 02:54:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.394 00:06:50.394 real 0m2.551s 00:06:50.395 user 0m2.309s 00:06:50.395 sys 0m0.242s 00:06:50.395 02:54:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.395 02:54:45 -- common/autotest_common.sh@10 -- # set +x 00:06:50.395 ************************************ 00:06:50.395 END TEST accel_dualcast 00:06:50.395 ************************************ 00:06:50.395 02:54:45 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:50.395 02:54:45 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:50.395 02:54:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:50.395 02:54:45 -- common/autotest_common.sh@10 -- # set +x 00:06:50.395 ************************************ 00:06:50.395 START TEST accel_compare 00:06:50.395 ************************************ 00:06:50.395 02:54:45 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:50.395 02:54:45 -- accel/accel.sh@16 -- # local accel_opc 00:06:50.395 02:54:45 -- 
accel/accel.sh@17 -- # local accel_module 00:06:50.395 02:54:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:50.395 02:54:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:50.395 02:54:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.395 02:54:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.395 02:54:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.395 02:54:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.395 02:54:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.395 02:54:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.395 02:54:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.395 02:54:45 -- accel/accel.sh@42 -- # jq -r . 00:06:50.395 [2024-07-14 02:54:45.275550] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:50.395 [2024-07-14 02:54:45.275644] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659093 ] 00:06:50.395 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.395 [2024-07-14 02:54:45.344917] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.395 [2024-07-14 02:54:45.380392] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.331 02:54:46 -- accel/accel.sh@18 -- # out=' 00:06:51.331 SPDK Configuration: 00:06:51.331 Core mask: 0x1 00:06:51.331 00:06:51.331 Accel Perf Configuration: 00:06:51.331 Workload Type: compare 00:06:51.331 Transfer size: 4096 bytes 00:06:51.331 Vector count 1 00:06:51.331 Module: software 00:06:51.331 Queue depth: 32 00:06:51.331 Allocate depth: 32 00:06:51.331 # threads/core: 1 00:06:51.331 Run time: 1 seconds 00:06:51.331 Verify: Yes 00:06:51.331 00:06:51.331 Running for 1 seconds... 00:06:51.331 00:06:51.331 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:51.331 ------------------------------------------------------------------------------------ 00:06:51.331 0,0 826976/s 3230 MiB/s 0 0 00:06:51.331 ==================================================================================== 00:06:51.331 Total 826976/s 3230 MiB/s 0 0' 00:06:51.331 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.331 02:54:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:51.331 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.331 02:54:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:51.331 02:54:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.331 02:54:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.331 02:54:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.331 02:54:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.331 02:54:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.331 02:54:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.331 02:54:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.331 02:54:46 -- accel/accel.sh@42 -- # jq -r . 00:06:51.331 [2024-07-14 02:54:46.548899] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
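compare is essentially the read-only workload of the set: two equal 4096-byte buffers are checked against each other, and any mismatch would land in the Failed/Miscompares columns (all 0 above, at 826976 transfers/s, roughly 3230 MiB/s). The harness repeats the same two-pass pattern for each remaining opcode -- xor is next below -- so spot-checking several by hand is just a loop (path taken from this log):

  SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
  for w in compare xor; do
    sudo "$SPDK_BIN" -t 1 -w "$w" -y
  done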
00:06:51.331 [2024-07-14 02:54:46.548952] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659371 ] 00:06:51.331 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.590 [2024-07-14 02:54:46.610615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.590 [2024-07-14 02:54:46.644524] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=0x1 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=compare 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=software 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@23 -- # accel_module=software 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=32 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=32 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=1 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val=Yes 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:51.590 02:54:46 -- accel/accel.sh@21 -- # val= 00:06:51.590 02:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # IFS=: 00:06:51.590 02:54:46 -- accel/accel.sh@20 -- # read -r var val 00:06:52.966 02:54:47 -- accel/accel.sh@21 -- # val= 00:06:52.967 02:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # IFS=: 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # read -r var val 00:06:52.967 02:54:47 -- accel/accel.sh@21 -- # val= 00:06:52.967 02:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # IFS=: 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # read -r var val 00:06:52.967 02:54:47 -- accel/accel.sh@21 -- # val= 00:06:52.967 02:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # IFS=: 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # read -r var val 00:06:52.967 02:54:47 -- accel/accel.sh@21 -- # val= 00:06:52.967 02:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # IFS=: 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # read -r var val 00:06:52.967 02:54:47 -- accel/accel.sh@21 -- # val= 00:06:52.967 02:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # IFS=: 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # read -r var val 00:06:52.967 02:54:47 -- accel/accel.sh@21 -- # val= 00:06:52.967 02:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # IFS=: 00:06:52.967 02:54:47 -- accel/accel.sh@20 -- # read -r var val 00:06:52.967 02:54:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:52.967 02:54:47 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:52.967 02:54:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.967 00:06:52.967 real 0m2.550s 00:06:52.967 user 0m2.309s 00:06:52.967 sys 0m0.240s 00:06:52.967 02:54:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:52.967 02:54:47 -- common/autotest_common.sh@10 -- # set +x 00:06:52.967 ************************************ 00:06:52.967 END TEST accel_compare 00:06:52.967 ************************************ 00:06:52.967 02:54:47 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:52.967 02:54:47 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:52.967 02:54:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:52.967 02:54:47 -- common/autotest_common.sh@10 -- # set +x 00:06:52.967 ************************************ 00:06:52.967 START TEST accel_xor 00:06:52.967 ************************************ 00:06:52.967 02:54:47 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:52.967 02:54:47 -- accel/accel.sh@16 -- # local accel_opc 00:06:52.967 02:54:47 -- accel/accel.sh@17 
-- # local accel_module 00:06:52.967 02:54:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:52.967 02:54:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:52.967 02:54:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.967 02:54:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.967 02:54:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.967 02:54:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.967 02:54:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.967 02:54:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.967 02:54:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.967 02:54:47 -- accel/accel.sh@42 -- # jq -r . 00:06:52.967 [2024-07-14 02:54:47.868864] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:52.967 [2024-07-14 02:54:47.868973] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659653 ] 00:06:52.967 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.967 [2024-07-14 02:54:47.940195] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.967 [2024-07-14 02:54:47.976135] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.900 02:54:49 -- accel/accel.sh@18 -- # out=' 00:06:53.900 SPDK Configuration: 00:06:53.900 Core mask: 0x1 00:06:53.900 00:06:53.900 Accel Perf Configuration: 00:06:53.900 Workload Type: xor 00:06:53.900 Source buffers: 2 00:06:53.900 Transfer size: 4096 bytes 00:06:53.900 Vector count 1 00:06:53.900 Module: software 00:06:53.900 Queue depth: 32 00:06:53.900 Allocate depth: 32 00:06:53.900 # threads/core: 1 00:06:53.900 Run time: 1 seconds 00:06:53.900 Verify: Yes 00:06:53.900 00:06:53.900 Running for 1 seconds... 00:06:53.900 00:06:53.900 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.900 ------------------------------------------------------------------------------------ 00:06:53.900 0,0 683872/s 2671 MiB/s 0 0 00:06:53.900 ==================================================================================== 00:06:53.900 Total 683872/s 2671 MiB/s 0 0' 00:06:53.900 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 02:54:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:53.900 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 02:54:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:53.900 02:54:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.900 02:54:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.900 02:54:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.900 02:54:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.900 02:54:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.900 02:54:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.900 02:54:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.900 02:54:49 -- accel/accel.sh@42 -- # jq -r . 00:06:53.900 [2024-07-14 02:54:49.145720] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
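The xor workload ("Source buffers: 2") XORs two 4 KiB source buffers into a destination on every transfer. A self-contained sketch of that inner operation (again illustrative, not the SPDK module itself):

/* One 2-source xor transfer: dst[i] = a[i] ^ b[i] over 4096 bytes. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define XFER_SIZE 4096

static void xor2(uint8_t *dst, const uint8_t *a, const uint8_t *b, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        dst[i] = a[i] ^ b[i];
    }
}

int main(void)
{
    static uint8_t a[XFER_SIZE], b[XFER_SIZE], dst[XFER_SIZE];

    memset(a, 0x0F, sizeof(a));
    memset(b, 0xF0, sizeof(b));
    xor2(dst, a, b, XFER_SIZE);
    printf("dst[0] = 0x%02x (expect 0xff)\n", dst[0]);   /* 0x0F ^ 0xF0 */
    return 0;
}

As before, 683872 transfers/s / 256 is about 2671 MiB/s, the bandwidth reported above.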
00:06:53.900 [2024-07-14 02:54:49.145773] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659919 ] 00:06:54.176 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.176 [2024-07-14 02:54:49.208226] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.176 [2024-07-14 02:54:49.242857] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=0x1 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=xor 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=2 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=software 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@23 -- # accel_module=software 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=32 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=32 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- 
accel/accel.sh@21 -- # val=1 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val=Yes 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:54.176 02:54:49 -- accel/accel.sh@21 -- # val= 00:06:54.176 02:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # IFS=: 00:06:54.176 02:54:49 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@21 -- # val= 00:06:55.550 02:54:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # IFS=: 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@21 -- # val= 00:06:55.550 02:54:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # IFS=: 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@21 -- # val= 00:06:55.550 02:54:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # IFS=: 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@21 -- # val= 00:06:55.550 02:54:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # IFS=: 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@21 -- # val= 00:06:55.550 02:54:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # IFS=: 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@21 -- # val= 00:06:55.550 02:54:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # IFS=: 00:06:55.550 02:54:50 -- accel/accel.sh@20 -- # read -r var val 00:06:55.550 02:54:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:55.550 02:54:50 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:55.550 02:54:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.550 00:06:55.550 real 0m2.559s 00:06:55.550 user 0m2.306s 00:06:55.550 sys 0m0.251s 00:06:55.550 02:54:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.550 02:54:50 -- common/autotest_common.sh@10 -- # set +x 00:06:55.550 ************************************ 00:06:55.550 END TEST accel_xor 00:06:55.550 ************************************ 00:06:55.550 02:54:50 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:55.550 02:54:50 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:55.550 02:54:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:55.550 02:54:50 -- common/autotest_common.sh@10 -- # set +x 00:06:55.550 ************************************ 00:06:55.550 START TEST accel_xor 
00:06:55.550 ************************************ 00:06:55.550 02:54:50 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:55.550 02:54:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:55.550 02:54:50 -- accel/accel.sh@17 -- # local accel_module 00:06:55.550 02:54:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:55.550 02:54:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:55.550 02:54:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.550 02:54:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.550 02:54:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.550 02:54:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.550 02:54:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.550 02:54:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.550 02:54:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.550 02:54:50 -- accel/accel.sh@42 -- # jq -r . 00:06:55.550 [2024-07-14 02:54:50.466169] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:55.550 [2024-07-14 02:54:50.466257] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660206 ] 00:06:55.550 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.550 [2024-07-14 02:54:50.536736] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.550 [2024-07-14 02:54:50.572542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.486 02:54:51 -- accel/accel.sh@18 -- # out=' 00:06:56.486 SPDK Configuration: 00:06:56.486 Core mask: 0x1 00:06:56.486 00:06:56.486 Accel Perf Configuration: 00:06:56.486 Workload Type: xor 00:06:56.486 Source buffers: 3 00:06:56.486 Transfer size: 4096 bytes 00:06:56.486 Vector count 1 00:06:56.486 Module: software 00:06:56.486 Queue depth: 32 00:06:56.486 Allocate depth: 32 00:06:56.486 # threads/core: 1 00:06:56.486 Run time: 1 seconds 00:06:56.486 Verify: Yes 00:06:56.486 00:06:56.486 Running for 1 seconds... 00:06:56.486 00:06:56.486 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.486 ------------------------------------------------------------------------------------ 00:06:56.486 0,0 646304/s 2524 MiB/s 0 0 00:06:56.486 ==================================================================================== 00:06:56.486 Total 646304/s 2524 MiB/s 0 0' 00:06:56.486 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.486 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.486 02:54:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:56.486 02:54:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:56.486 02:54:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.486 02:54:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.486 02:54:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.486 02:54:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.486 02:54:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.486 02:54:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.745 02:54:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.745 02:54:51 -- accel/accel.sh@42 -- # jq -r . 00:06:56.745 [2024-07-14 02:54:51.744473] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
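This second xor test re-runs the same workload with -x 3, i.e. "Source buffers: 3". The operation generalizes to any number of sources (the parity-style loop familiar from RAID-5); a sketch under that reading:

/* N-source xor: dst = srcs[0] ^ srcs[1] ^ ... ^ srcs[nsrc-1]. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define XFER_SIZE 4096

static void xor_n(uint8_t *dst, const uint8_t *const srcs[], size_t nsrc, size_t len)
{
    memcpy(dst, srcs[0], len);
    for (size_t s = 1; s < nsrc; s++) {
        for (size_t i = 0; i < len; i++) {
            dst[i] ^= srcs[s][i];
        }
    }
}

int main(void)
{
    static uint8_t a[XFER_SIZE], b[XFER_SIZE], c[XFER_SIZE], dst[XFER_SIZE];
    const uint8_t *srcs[] = { a, b, c };        /* "Source buffers: 3" */

    memset(a, 0xAA, sizeof(a));
    memset(b, 0x55, sizeof(b));
    memset(c, 0xFF, sizeof(c));
    xor_n(dst, srcs, 3, XFER_SIZE);
    printf("dst[0] = 0x%02x (expect 0x00)\n", dst[0]);  /* 0xAA^0x55^0xFF */
    return 0;
}

The extra pass over a third buffer costs throughput, consistent with the drop from 683872 transfers/s (two sources) to 646304 transfers/s, about 2524 MiB/s, here.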
00:06:56.745 [2024-07-14 02:54:51.744538] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660364 ] 00:06:56.745 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.745 [2024-07-14 02:54:51.806852] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.745 [2024-07-14 02:54:51.841112] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.745 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.745 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.745 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.745 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.745 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.745 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.745 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.745 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.745 02:54:51 -- accel/accel.sh@21 -- # val=0x1 00:06:56.745 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val=xor 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val=3 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val=software 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val=32 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val=32 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- 
accel/accel.sh@21 -- # val=1 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val=Yes 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:56.746 02:54:51 -- accel/accel.sh@21 -- # val= 00:06:56.746 02:54:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # IFS=: 00:06:56.746 02:54:51 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@21 -- # val= 00:06:58.126 02:54:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # IFS=: 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@21 -- # val= 00:06:58.126 02:54:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # IFS=: 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@21 -- # val= 00:06:58.126 02:54:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # IFS=: 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@21 -- # val= 00:06:58.126 02:54:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # IFS=: 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@21 -- # val= 00:06:58.126 02:54:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # IFS=: 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@21 -- # val= 00:06:58.126 02:54:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # IFS=: 00:06:58.126 02:54:53 -- accel/accel.sh@20 -- # read -r var val 00:06:58.126 02:54:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:58.126 02:54:53 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:58.126 02:54:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.126 00:06:58.126 real 0m2.558s 00:06:58.126 user 0m2.318s 00:06:58.126 sys 0m0.238s 00:06:58.126 02:54:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.126 02:54:53 -- common/autotest_common.sh@10 -- # set +x 00:06:58.126 ************************************ 00:06:58.126 END TEST accel_xor 00:06:58.126 ************************************ 00:06:58.126 02:54:53 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:58.126 02:54:53 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:58.126 02:54:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:58.126 02:54:53 -- common/autotest_common.sh@10 -- # set +x 00:06:58.126 ************************************ 00:06:58.126 START TEST 
accel_dif_verify 00:06:58.126 ************************************ 00:06:58.126 02:54:53 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:58.126 02:54:53 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.126 02:54:53 -- accel/accel.sh@17 -- # local accel_module 00:06:58.126 02:54:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:58.126 02:54:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:58.126 02:54:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.126 02:54:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.126 02:54:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.126 02:54:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.126 02:54:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.126 02:54:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.126 02:54:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.126 02:54:53 -- accel/accel.sh@42 -- # jq -r . 00:06:58.126 [2024-07-14 02:54:53.063968] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:58.126 [2024-07-14 02:54:53.064058] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660533 ] 00:06:58.126 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.126 [2024-07-14 02:54:53.132821] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.126 [2024-07-14 02:54:53.167803] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.504 02:54:54 -- accel/accel.sh@18 -- # out=' 00:06:59.504 SPDK Configuration: 00:06:59.504 Core mask: 0x1 00:06:59.504 00:06:59.504 Accel Perf Configuration: 00:06:59.504 Workload Type: dif_verify 00:06:59.504 Vector size: 4096 bytes 00:06:59.504 Transfer size: 4096 bytes 00:06:59.504 Block size: 512 bytes 00:06:59.504 Metadata size: 8 bytes 00:06:59.504 Vector count 1 00:06:59.504 Module: software 00:06:59.504 Queue depth: 32 00:06:59.504 Allocate depth: 32 00:06:59.504 # threads/core: 1 00:06:59.504 Run time: 1 seconds 00:06:59.504 Verify: No 00:06:59.504 00:06:59.504 Running for 1 seconds... 00:06:59.504 00:06:59.504 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.504 ------------------------------------------------------------------------------------ 00:06:59.504 0,0 238784/s 947 MiB/s 0 0 00:06:59.504 ==================================================================================== 00:06:59.504 Total 238784/s 932 MiB/s 0 0' 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.504 02:54:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:59.504 02:54:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:59.504 02:54:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.504 02:54:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.504 02:54:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.504 02:54:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.504 02:54:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.504 02:54:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.504 02:54:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.504 02:54:54 -- accel/accel.sh@42 -- # jq -r . 
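dif_verify is the first workload here with per-block metadata: "Block size: 512 bytes" plus "Metadata size: 8 bytes" describes a T10 DIF-style layout, where each 512-byte block carries a 16-bit CRC guard tag, a 16-bit application tag, and a 32-bit reference tag. Verifying a 4 KiB transfer means recomputing the guard CRC for each of its eight blocks and comparing. A simplified, self-contained sketch (SPDK's real DIF code also handles endianness, tag-check flags, and interleaved metadata; this is not that code):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define BLOCK_SIZE 512
#define XFER_SIZE  4096
#define NBLOCKS    (XFER_SIZE / BLOCK_SIZE)    /* 8 blocks per transfer */

struct dif {                                   /* the 8 bytes of metadata */
    uint16_t guard;                            /* CRC-16/T10-DIF over the block */
    uint16_t app_tag;
    uint32_t ref_tag;                          /* typically derived from the LBA */
};

/* CRC-16/T10-DIF: polynomial 0x8BB7, init 0, no reflection. */
static uint16_t crc16_t10dif(const uint8_t *buf, size_t len)
{
    uint16_t crc = 0;

    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)buf[i] << 8;
        for (int b = 0; b < 8; b++) {
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x8BB7)
                                 : (uint16_t)(crc << 1);
        }
    }
    return crc;
}

int main(void)
{
    static uint8_t data[XFER_SIZE];
    struct dif md[NBLOCKS];

    memset(data, 0x5A, sizeof(data));
    for (int i = 0; i < NBLOCKS; i++) {        /* generate side */
        md[i].guard   = crc16_t10dif(data + i * BLOCK_SIZE, BLOCK_SIZE);
        md[i].app_tag = 0;
        md[i].ref_tag = (uint32_t)i;
    }
    for (int i = 0; i < NBLOCKS; i++) {        /* verify side */
        uint16_t guard = crc16_t10dif(data + i * BLOCK_SIZE, BLOCK_SIZE);
        if (guard != md[i].guard) {
            printf("block %d: guard mismatch\n", i);
            return 1;
        }
    }
    printf("all %d blocks verified\n", NBLOCKS);
    return 0;
}

The per-block CRC work is why throughput falls to 238784 transfers/s (about 932 MiB/s on the Total line), well below the copy- and compare-class workloads above.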
00:06:59.504 [2024-07-14 02:54:54.337843] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:59.504 [2024-07-14 02:54:54.337910] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660780 ] 00:06:59.504 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.504 [2024-07-14 02:54:54.401291] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.504 [2024-07-14 02:54:54.435727] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.504 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.504 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.504 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.504 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.504 02:54:54 -- accel/accel.sh@21 -- # val=0x1 00:06:59.504 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.504 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.504 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.504 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.504 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.504 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.504 02:54:54 -- accel/accel.sh@21 -- # val=dif_verify 00:06:59.504 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val=software 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val=32 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val=32 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val=1 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val=No 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:06:59.505 02:54:54 -- accel/accel.sh@21 -- # val= 00:06:59.505 02:54:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # IFS=: 00:06:59.505 02:54:54 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@21 -- # val= 00:07:00.441 02:54:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # IFS=: 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@21 -- # val= 00:07:00.441 02:54:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # IFS=: 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@21 -- # val= 00:07:00.441 02:54:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # IFS=: 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@21 -- # val= 00:07:00.441 02:54:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # IFS=: 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@21 -- # val= 00:07:00.441 02:54:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # IFS=: 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@21 -- # val= 00:07:00.441 02:54:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # IFS=: 00:07:00.441 02:54:55 -- accel/accel.sh@20 -- # read -r var val 00:07:00.441 02:54:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.441 02:54:55 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:00.441 02:54:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.441 00:07:00.441 real 0m2.553s 00:07:00.441 user 0m2.306s 00:07:00.441 sys 0m0.245s 00:07:00.441 02:54:55 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.441 02:54:55 -- common/autotest_common.sh@10 -- # set +x 00:07:00.441 ************************************ 00:07:00.441 END TEST accel_dif_verify 00:07:00.441 ************************************ 00:07:00.441 02:54:55 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:00.441 02:54:55 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:00.441 02:54:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:00.441 02:54:55 -- common/autotest_common.sh@10 -- # set +x 00:07:00.441 ************************************ 00:07:00.441 START TEST accel_dif_generate 00:07:00.441 ************************************ 00:07:00.441 02:54:55 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:07:00.441 02:54:55 -- accel/accel.sh@16 -- # local accel_opc 00:07:00.441 02:54:55 -- accel/accel.sh@17 -- # local accel_module 00:07:00.441 02:54:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:00.441 02:54:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:00.441 02:54:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.441 02:54:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.441 02:54:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.441 02:54:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.441 02:54:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.441 02:54:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.442 02:54:55 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.442 02:54:55 -- accel/accel.sh@42 -- # jq -r . 00:07:00.442 [2024-07-14 02:54:55.654939] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:00.442 [2024-07-14 02:54:55.655028] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661071 ] 00:07:00.442 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.700 [2024-07-14 02:54:55.723256] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.700 [2024-07-14 02:54:55.758580] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.077 02:54:56 -- accel/accel.sh@18 -- # out=' 00:07:02.077 SPDK Configuration: 00:07:02.077 Core mask: 0x1 00:07:02.077 00:07:02.077 Accel Perf Configuration: 00:07:02.077 Workload Type: dif_generate 00:07:02.077 Vector size: 4096 bytes 00:07:02.077 Transfer size: 4096 bytes 00:07:02.077 Block size: 512 bytes 00:07:02.077 Metadata size: 8 bytes 00:07:02.077 Vector count 1 00:07:02.077 Module: software 00:07:02.077 Queue depth: 32 00:07:02.077 Allocate depth: 32 00:07:02.077 # threads/core: 1 00:07:02.077 Run time: 1 seconds 00:07:02.077 Verify: No 00:07:02.077 00:07:02.077 Running for 1 seconds... 
00:07:02.077 00:07:02.077 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:02.077 ------------------------------------------------------------------------------------ 00:07:02.077 0,0 290048/s 1150 MiB/s 0 0 00:07:02.077 ==================================================================================== 00:07:02.077 Total 290048/s 1133 MiB/s 0 0' 00:07:02.077 02:54:56 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:56 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:56 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:02.077 02:54:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:02.077 02:54:56 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.077 02:54:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.077 02:54:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.077 02:54:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.077 02:54:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.077 02:54:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.077 02:54:56 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.077 02:54:56 -- accel/accel.sh@42 -- # jq -r . 00:07:02.077 [2024-07-14 02:54:56.928190] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:02.077 [2024-07-14 02:54:56.928255] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661340 ] 00:07:02.077 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.077 [2024-07-14 02:54:56.991810] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.077 [2024-07-14 02:54:57.025940] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val=0x1 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val=dif_generate 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 
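dif_generate is the producer half of the dif_verify sketch above: it computes and writes the 8-byte DIF per 512-byte block but compares nothing, hence "Verify: No" in the configuration. The Total line also checks out arithmetically: with 4 KiB transfers, MiB/s = transfers/s / 256, and 290048 / 256 = 1133 MiB/s exactly, as printed. (The per-core 1150 MiB/s figure is presumably computed over a slightly different elapsed-time window, which would explain why the two columns disagree.)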
00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.077 02:54:57 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:02.077 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.077 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val=software 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@23 -- # accel_module=software 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val=32 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val=32 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val=1 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val=No 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:02.078 02:54:57 -- accel/accel.sh@21 -- # val= 00:07:02.078 02:54:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # IFS=: 00:07:02.078 02:54:57 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@21 -- # val= 00:07:03.017 02:54:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # IFS=: 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@21 -- # val= 00:07:03.017 02:54:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # IFS=: 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@21 -- # val= 00:07:03.017 02:54:58 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # IFS=: 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@21 -- # val= 00:07:03.017 02:54:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # IFS=: 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@21 -- # val= 00:07:03.017 02:54:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # IFS=: 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@21 -- # val= 00:07:03.017 02:54:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # IFS=: 00:07:03.017 02:54:58 -- accel/accel.sh@20 -- # read -r var val 00:07:03.017 02:54:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:03.017 02:54:58 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:03.017 02:54:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:03.017 00:07:03.017 real 0m2.550s 00:07:03.017 user 0m2.302s 00:07:03.017 sys 0m0.246s 00:07:03.017 02:54:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:03.017 02:54:58 -- common/autotest_common.sh@10 -- # set +x 00:07:03.017 ************************************ 00:07:03.017 END TEST accel_dif_generate 00:07:03.017 ************************************ 00:07:03.017 02:54:58 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:03.017 02:54:58 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:03.017 02:54:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:03.017 02:54:58 -- common/autotest_common.sh@10 -- # set +x 00:07:03.017 ************************************ 00:07:03.017 START TEST accel_dif_generate_copy 00:07:03.017 ************************************ 00:07:03.017 02:54:58 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:07:03.017 02:54:58 -- accel/accel.sh@16 -- # local accel_opc 00:07:03.017 02:54:58 -- accel/accel.sh@17 -- # local accel_module 00:07:03.017 02:54:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:03.017 02:54:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:03.017 02:54:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.017 02:54:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.017 02:54:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.017 02:54:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.017 02:54:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.017 02:54:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.017 02:54:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.017 02:54:58 -- accel/accel.sh@42 -- # jq -r . 00:07:03.017 [2024-07-14 02:54:58.246226] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:03.017 [2024-07-14 02:54:58.246331] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661621 ] 00:07:03.276 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.276 [2024-07-14 02:54:58.313864] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.276 [2024-07-14 02:54:58.349083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.746 02:54:59 -- accel/accel.sh@18 -- # out=' 00:07:04.746 SPDK Configuration: 00:07:04.746 Core mask: 0x1 00:07:04.746 00:07:04.746 Accel Perf Configuration: 00:07:04.746 Workload Type: dif_generate_copy 00:07:04.746 Vector size: 4096 bytes 00:07:04.746 Transfer size: 4096 bytes 00:07:04.746 Vector count 1 00:07:04.746 Module: software 00:07:04.746 Queue depth: 32 00:07:04.746 Allocate depth: 32 00:07:04.746 # threads/core: 1 00:07:04.746 Run time: 1 seconds 00:07:04.746 Verify: No 00:07:04.746 00:07:04.746 Running for 1 seconds... 00:07:04.746 00:07:04.746 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:04.746 ------------------------------------------------------------------------------------ 00:07:04.746 0,0 225344/s 894 MiB/s 0 0 00:07:04.746 ==================================================================================== 00:07:04.746 Total 225344/s 880 MiB/s 0 0' 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:04.746 02:54:59 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.746 02:54:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.746 02:54:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.746 02:54:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:04.746 02:54:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.746 02:54:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.746 02:54:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.746 02:54:59 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.746 02:54:59 -- accel/accel.sh@42 -- # jq -r . 00:07:04.746 [2024-07-14 02:54:59.527726] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
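dif_generate_copy combines the two previous steps into one: each 4 KiB transfer is copied to a destination buffer and its DIF metadata generated along the way, so it pays both the memcpy and the per-block CRC cost (225344 transfers/s, about 880 MiB/s, here versus 290048/s for generate alone). A sketch of the combined per-block step, reusing the same guard CRC as the dif_verify sketch (illustrative; not SPDK's implementation):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define BLOCK_SIZE 512
#define XFER_SIZE  4096

/* CRC-16/T10-DIF (poly 0x8BB7), identical to the dif_verify sketch above. */
static uint16_t crc16_t10dif(const uint8_t *buf, size_t len)
{
    uint16_t crc = 0;

    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)buf[i] << 8;
        for (int b = 0; b < 8; b++) {
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x8BB7)
                                 : (uint16_t)(crc << 1);
        }
    }
    return crc;
}

int main(void)
{
    static uint8_t src[XFER_SIZE], dst[XFER_SIZE];
    uint16_t guard[XFER_SIZE / BLOCK_SIZE];

    memset(src, 0x3C, sizeof(src));
    for (int i = 0; i < XFER_SIZE / BLOCK_SIZE; i++) {
        /* copy the block and generate its guard in the same pass */
        memcpy(dst + i * BLOCK_SIZE, src + i * BLOCK_SIZE, BLOCK_SIZE);
        guard[i] = crc16_t10dif(dst + i * BLOCK_SIZE, BLOCK_SIZE);
    }
    printf("block 0 guard: 0x%04x\n", guard[0]);
    return 0;
}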
00:07:04.746 [2024-07-14 02:54:59.527816] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661823 ] 00:07:04.746 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.746 [2024-07-14 02:54:59.595648] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.746 [2024-07-14 02:54:59.630657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val=0x1 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val=software 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@23 -- # accel_module=software 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val=32 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val=32 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var 
val 00:07:04.746 02:54:59 -- accel/accel.sh@21 -- # val=1 00:07:04.746 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.746 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.747 02:54:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:04.747 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.747 02:54:59 -- accel/accel.sh@21 -- # val=No 00:07:04.747 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.747 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.747 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:04.747 02:54:59 -- accel/accel.sh@21 -- # val= 00:07:04.747 02:54:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # IFS=: 00:07:04.747 02:54:59 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@21 -- # val= 00:07:05.685 02:55:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # IFS=: 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@21 -- # val= 00:07:05.685 02:55:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # IFS=: 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@21 -- # val= 00:07:05.685 02:55:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # IFS=: 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@21 -- # val= 00:07:05.685 02:55:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # IFS=: 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@21 -- # val= 00:07:05.685 02:55:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # IFS=: 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@21 -- # val= 00:07:05.685 02:55:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # IFS=: 00:07:05.685 02:55:00 -- accel/accel.sh@20 -- # read -r var val 00:07:05.685 02:55:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:05.685 02:55:00 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:05.685 02:55:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.685 00:07:05.685 real 0m2.566s 00:07:05.685 user 0m2.300s 00:07:05.685 sys 0m0.263s 00:07:05.685 02:55:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.685 02:55:00 -- common/autotest_common.sh@10 -- # set +x 00:07:05.685 ************************************ 00:07:05.685 END TEST accel_dif_generate_copy 00:07:05.685 ************************************ 00:07:05.685 02:55:00 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:05.685 02:55:00 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.685 02:55:00 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:05.685 02:55:00 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:07:05.685 02:55:00 -- common/autotest_common.sh@10 -- # set +x 00:07:05.685 ************************************ 00:07:05.685 START TEST accel_comp 00:07:05.685 ************************************ 00:07:05.685 02:55:00 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.685 02:55:00 -- accel/accel.sh@16 -- # local accel_opc 00:07:05.685 02:55:00 -- accel/accel.sh@17 -- # local accel_module 00:07:05.685 02:55:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.685 02:55:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.685 02:55:00 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.685 02:55:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.685 02:55:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.685 02:55:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.685 02:55:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.685 02:55:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.685 02:55:00 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.685 02:55:00 -- accel/accel.sh@42 -- # jq -r . 00:07:05.685 [2024-07-14 02:55:00.851271] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:05.685 [2024-07-14 02:55:00.851360] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662007 ] 00:07:05.685 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.685 [2024-07-14 02:55:00.918694] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.945 [2024-07-14 02:55:00.954416] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.882 02:55:02 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:06.882 00:07:06.882 SPDK Configuration: 00:07:06.882 Core mask: 0x1 00:07:06.882 00:07:06.882 Accel Perf Configuration: 00:07:06.882 Workload Type: compress 00:07:06.882 Transfer size: 4096 bytes 00:07:06.882 Vector count 1 00:07:06.882 Module: software 00:07:06.882 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:06.882 Queue depth: 32 00:07:06.882 Allocate depth: 32 00:07:06.882 # threads/core: 1 00:07:06.882 Run time: 1 seconds 00:07:06.882 Verify: No 00:07:06.882 00:07:06.882 Running for 1 seconds... 
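The accel_comp test that begins above drives the standalone accel_perf example binary with a software-path compress workload; each field in the configuration block just printed (workload type, transfer size, queue depth, run time) maps onto an argument of the traced command line. A minimal sketch of reproducing the run by hand, using only flags visible in this trace and assuming the same workspace layout (the harness's /dev/fd/62 JSON-config plumbing is omitted, so the tool falls back to its defaults):

  #!/usr/bin/env bash
  # Re-run the 1-second software compress benchmark outside the test harness.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # -t 1: run time in seconds, -w compress: workload type,
  # -l: input file to operate on -- all three match the configuration above.
  "$SPDK/build/examples/accel_perf" -t 1 -w compress -l "$SPDK/test/accel/bib"

With no accel module configured, the run uses the software engine, which is why the trace records accel_module=software.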
00:07:06.882
00:07:06.882 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:06.882 ------------------------------------------------------------------------------------
00:07:06.882 0,0 66496/s 277 MiB/s 0 0
00:07:06.882 ====================================================================================
00:07:06.882 Total 66496/s 259 MiB/s 0 0'
00:07:06.882 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:06.882 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:06.882 02:55:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:06.882 02:55:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:06.882 02:55:02 -- accel/accel.sh@12 -- # build_accel_config
00:07:06.882 02:55:02 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:06.882 02:55:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:06.882 02:55:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:06.882 02:55:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:06.882 02:55:02 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:06.882 02:55:02 -- accel/accel.sh@41 -- # local IFS=,
00:07:06.882 02:55:02 -- accel/accel.sh@42 -- # jq -r .
00:07:07.141 [2024-07-14 02:55:02.135758] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:07:07.141 [2024-07-14 02:55:02.135865] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662201 ]
00:07:07.141 EAL: No free 2048 kB hugepages reported on node 1
00:07:07.141 [2024-07-14 02:55:02.206135] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:07.141 [2024-07-14 02:55:02.240564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=0x1
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=:
00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val
00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=compress
00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in
00:07:07.141
02:55:02 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val= 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=software 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 02:55:02 -- accel/accel.sh@23 -- # accel_module=software 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=32 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=32 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 02:55:02 -- accel/accel.sh@21 -- # val=1 00:07:07.141 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.142 02:55:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:07.142 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.142 02:55:02 -- accel/accel.sh@21 -- # val=No 00:07:07.142 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.142 02:55:02 -- accel/accel.sh@21 -- # val= 00:07:07.142 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:07.142 02:55:02 -- accel/accel.sh@21 -- # val= 00:07:07.142 02:55:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # IFS=: 00:07:07.142 02:55:02 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@21 -- # val= 00:07:08.525 02:55:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # IFS=: 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@21 -- # val= 00:07:08.525 02:55:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # IFS=: 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@21 -- # val= 00:07:08.525 02:55:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # 
IFS=: 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@21 -- # val= 00:07:08.525 02:55:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # IFS=: 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@21 -- # val= 00:07:08.525 02:55:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # IFS=: 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@21 -- # val= 00:07:08.525 02:55:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # IFS=: 00:07:08.525 02:55:03 -- accel/accel.sh@20 -- # read -r var val 00:07:08.525 02:55:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:08.525 02:55:03 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:08.525 02:55:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.525 00:07:08.525 real 0m2.571s 00:07:08.525 user 0m2.315s 00:07:08.525 sys 0m0.253s 00:07:08.525 02:55:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.525 02:55:03 -- common/autotest_common.sh@10 -- # set +x 00:07:08.525 ************************************ 00:07:08.525 END TEST accel_comp 00:07:08.525 ************************************ 00:07:08.525 02:55:03 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:08.525 02:55:03 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:08.525 02:55:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:08.525 02:55:03 -- common/autotest_common.sh@10 -- # set +x 00:07:08.525 ************************************ 00:07:08.525 START TEST accel_decomp 00:07:08.525 ************************************ 00:07:08.525 02:55:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:08.525 02:55:03 -- accel/accel.sh@16 -- # local accel_opc 00:07:08.525 02:55:03 -- accel/accel.sh@17 -- # local accel_module 00:07:08.525 02:55:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:08.525 02:55:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:08.525 02:55:03 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.525 02:55:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.525 02:55:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.525 02:55:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.525 02:55:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.525 02:55:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.525 02:55:03 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.525 02:55:03 -- accel/accel.sh@42 -- # jq -r . 00:07:08.525 [2024-07-14 02:55:03.460854] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
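accel_decomp flips the workload to decompress and, unlike the compress run above, passes -y: the configuration block below therefore reports "Verify: Yes" where the compress run showed "Verify: No", so every decompressed buffer is checked against the original data rather than just timed. A sketch of the equivalent hand-run (same assumed workspace layout as the earlier sketch):

  #!/usr/bin/env bash
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # -y enables verification of each completed operation.
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -y -l "$SPDK/test/accel/bib"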
00:07:08.525 [2024-07-14 02:55:03.460954] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662482 ]
00:07:08.525 EAL: No free 2048 kB hugepages reported on node 1
00:07:08.525 [2024-07-14 02:55:03.529383] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:08.525 [2024-07-14 02:55:03.564323] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:09.902 02:55:04 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:09.902
00:07:09.902 SPDK Configuration:
00:07:09.902 Core mask: 0x1
00:07:09.902
00:07:09.902 Accel Perf Configuration:
00:07:09.902 Workload Type: decompress
00:07:09.902 Transfer size: 4096 bytes
00:07:09.902 Vector count 1
00:07:09.902 Module: software
00:07:09.902 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:09.902 Queue depth: 32
00:07:09.902 Allocate depth: 32
00:07:09.902 # threads/core: 1
00:07:09.902 Run time: 1 seconds
00:07:09.902 Verify: Yes
00:07:09.902
00:07:09.902 Running for 1 seconds...
00:07:09.902
00:07:09.902 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:09.902 ------------------------------------------------------------------------------------
00:07:09.902 0,0 93312/s 171 MiB/s 0 0
00:07:09.902 ====================================================================================
00:07:09.902 Total 93312/s 364 MiB/s 0 0'
00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=:
00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val
00:07:09.902 02:55:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:09.902 02:55:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:09.902 02:55:04 -- accel/accel.sh@12 -- # build_accel_config
00:07:09.902 02:55:04 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:09.902 02:55:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:09.902 02:55:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:09.902 02:55:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:09.902 02:55:04 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:09.902 02:55:04 -- accel/accel.sh@41 -- # local IFS=,
00:07:09.902 02:55:04 -- accel/accel.sh@42 -- # jq -r .
00:07:09.902 [2024-07-14 02:55:04.737243] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
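The result rows reduce to two numbers: completed transfers per second and the transfer size. For the total above, 93312 transfers/s at 4096 bytes each works out to 93312 * 4096 / 1048576, roughly 364 MiB/s, matching the Total row. A one-liner to sanity-check any row of these tables:

  # transfers/s and bytes-per-transfer -> MiB/s; prints "364.50 MiB/s" for this run
  awk -v tps=93312 -v size=4096 'BEGIN { printf "%.2f MiB/s\n", tps * size / 1048576 }'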
00:07:09.902 [2024-07-14 02:55:04.737303] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662750 ] 00:07:09.902 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.902 [2024-07-14 02:55:04.802385] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.902 [2024-07-14 02:55:04.837339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=0x1 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=decompress 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=software 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@23 -- # accel_module=software 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=32 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 
-- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=32 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=1 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val=Yes 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:09.902 02:55:04 -- accel/accel.sh@21 -- # val= 00:07:09.902 02:55:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # IFS=: 00:07:09.902 02:55:04 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:05 -- accel/accel.sh@21 -- # val= 00:07:10.838 02:55:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # IFS=: 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:06 -- accel/accel.sh@21 -- # val= 00:07:10.838 02:55:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # IFS=: 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:06 -- accel/accel.sh@21 -- # val= 00:07:10.838 02:55:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # IFS=: 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:06 -- accel/accel.sh@21 -- # val= 00:07:10.838 02:55:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # IFS=: 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:06 -- accel/accel.sh@21 -- # val= 00:07:10.838 02:55:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # IFS=: 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:06 -- accel/accel.sh@21 -- # val= 00:07:10.838 02:55:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # IFS=: 00:07:10.838 02:55:06 -- accel/accel.sh@20 -- # read -r var val 00:07:10.838 02:55:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:10.838 02:55:06 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:10.838 02:55:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.838 00:07:10.838 real 0m2.562s 00:07:10.838 user 0m2.315s 00:07:10.838 sys 0m0.244s 00:07:10.838 02:55:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.838 02:55:06 -- common/autotest_common.sh@10 -- # set +x 00:07:10.838 ************************************ 00:07:10.838 END TEST accel_decomp 00:07:10.838 ************************************ 00:07:10.838 02:55:06 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:10.838 02:55:06 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:10.838 02:55:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:10.838 02:55:06 -- common/autotest_common.sh@10 -- # set +x 00:07:10.838 ************************************ 00:07:10.838 START TEST accel_decmop_full 00:07:10.838 ************************************ 00:07:10.838 02:55:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:10.838 02:55:06 -- accel/accel.sh@16 -- # local accel_opc 00:07:10.838 02:55:06 -- accel/accel.sh@17 -- # local accel_module 00:07:10.838 02:55:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:10.838 02:55:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:10.838 02:55:06 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.838 02:55:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.838 02:55:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.838 02:55:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.838 02:55:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.838 02:55:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.838 02:55:06 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.838 02:55:06 -- accel/accel.sh@42 -- # jq -r . 00:07:10.838 [2024-07-14 02:55:06.064257] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:10.838 [2024-07-14 02:55:06.064345] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663042 ] 00:07:11.097 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.097 [2024-07-14 02:55:06.132780] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.097 [2024-07-14 02:55:06.167711] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.473 02:55:07 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:12.473 00:07:12.473 SPDK Configuration: 00:07:12.473 Core mask: 0x1 00:07:12.473 00:07:12.473 Accel Perf Configuration: 00:07:12.473 Workload Type: decompress 00:07:12.473 Transfer size: 111250 bytes 00:07:12.473 Vector count 1 00:07:12.473 Module: software 00:07:12.473 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.473 Queue depth: 32 00:07:12.473 Allocate depth: 32 00:07:12.473 # threads/core: 1 00:07:12.473 Run time: 1 seconds 00:07:12.473 Verify: Yes 00:07:12.473 00:07:12.473 Running for 1 seconds... 
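This run, registered as accel_decmop_full (the transposed spelling comes from the harness itself), adds -o 0 to the same decompress workload; the visible difference in the configuration block above is the 111250-byte transfer size in place of the 4096-byte size of the earlier runs, so each operation covers a full data block. Sketch of the traced invocation (workspace path assumed as before):

  #!/usr/bin/env bash
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # Same verified decompress run, with -o 0 as recorded in the trace above;
  # the printed configuration reports "Transfer size: 111250 bytes" for it.
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -y -o 0 -l "$SPDK/test/accel/bib"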
00:07:12.473
00:07:12.473 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:12.473 ------------------------------------------------------------------------------------
00:07:12.473 0,0 5792/s 239 MiB/s 0 0
00:07:12.473 ====================================================================================
00:07:12.473 Total 5792/s 614 MiB/s 0 0'
00:07:12.473 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.473 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.473 02:55:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:12.473 02:55:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:12.474 02:55:07 -- accel/accel.sh@12 -- # build_accel_config
00:07:12.474 02:55:07 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:12.474 02:55:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:12.474 02:55:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:12.474 02:55:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:12.474 02:55:07 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:12.474 02:55:07 -- accel/accel.sh@41 -- # local IFS=,
00:07:12.474 02:55:07 -- accel/accel.sh@42 -- # jq -r .
00:07:12.474 [2024-07-14 02:55:07.353984] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:07:12.474 [2024-07-14 02:55:07.354049] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663308 ]
00:07:12.474 EAL: No free 2048 kB hugepages reported on node 1
00:07:12.474 [2024-07-14 02:55:07.417744] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:12.474 [2024-07-14 02:55:07.452325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=0x1
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=:
00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val
00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=decompress
00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case
"$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val= 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=software 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@23 -- # accel_module=software 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=32 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=32 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=1 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val=Yes 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val= 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:12.474 02:55:07 -- accel/accel.sh@21 -- # val= 00:07:12.474 02:55:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # IFS=: 00:07:12.474 02:55:07 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@21 -- # val= 00:07:13.410 02:55:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # IFS=: 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@21 -- # val= 00:07:13.410 02:55:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # IFS=: 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@21 -- # val= 00:07:13.410 02:55:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.410 02:55:08 
-- accel/accel.sh@20 -- # IFS=: 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@21 -- # val= 00:07:13.410 02:55:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # IFS=: 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@21 -- # val= 00:07:13.410 02:55:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # IFS=: 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@21 -- # val= 00:07:13.410 02:55:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # IFS=: 00:07:13.410 02:55:08 -- accel/accel.sh@20 -- # read -r var val 00:07:13.410 02:55:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:13.410 02:55:08 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:13.410 02:55:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:13.410 00:07:13.410 real 0m2.580s 00:07:13.410 user 0m2.327s 00:07:13.410 sys 0m0.249s 00:07:13.410 02:55:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.410 02:55:08 -- common/autotest_common.sh@10 -- # set +x 00:07:13.410 ************************************ 00:07:13.410 END TEST accel_decmop_full 00:07:13.410 ************************************ 00:07:13.410 02:55:08 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:13.410 02:55:08 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:13.410 02:55:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:13.410 02:55:08 -- common/autotest_common.sh@10 -- # set +x 00:07:13.669 ************************************ 00:07:13.669 START TEST accel_decomp_mcore 00:07:13.669 ************************************ 00:07:13.669 02:55:08 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:13.669 02:55:08 -- accel/accel.sh@16 -- # local accel_opc 00:07:13.669 02:55:08 -- accel/accel.sh@17 -- # local accel_module 00:07:13.669 02:55:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:13.669 02:55:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:13.669 02:55:08 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.669 02:55:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.669 02:55:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.669 02:55:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.669 02:55:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.669 02:55:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.669 02:55:08 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.669 02:55:08 -- accel/accel.sh@42 -- # jq -r . 00:07:13.669 [2024-07-14 02:55:08.684344] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
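accel_decomp_mcore is the same decompress workload spread across several reactors: the extra -m 0xf argument is a hexadecimal core mask (binary 1111, cores 0 through 3), and the EAL banner that follows duly reports four available cores and starts one reactor on each. Sketch of the traced invocation:

  #!/usr/bin/env bash
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # -m 0xf: core mask 1111b -> reactors on cores 0, 1, 2 and 3
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -y -m 0xf -l "$SPDK/test/accel/bib"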
00:07:13.669 [2024-07-14 02:55:08.684434] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663527 ]
00:07:13.669 EAL: No free 2048 kB hugepages reported on node 1
00:07:13.669 [2024-07-14 02:55:08.753082] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:13.669 [2024-07-14 02:55:08.791319] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:07:13.669 [2024-07-14 02:55:08.791416] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:07:13.669 [2024-07-14 02:55:08.791521] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:07:13.669 [2024-07-14 02:55:08.791523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:15.046 02:55:09 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:15.046
00:07:15.046 SPDK Configuration:
00:07:15.046 Core mask: 0xf
00:07:15.046
00:07:15.046 Accel Perf Configuration:
00:07:15.046 Workload Type: decompress
00:07:15.046 Transfer size: 4096 bytes
00:07:15.046 Vector count 1
00:07:15.046 Module: software
00:07:15.046 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:15.046 Queue depth: 32
00:07:15.046 Allocate depth: 32
00:07:15.046 # threads/core: 1
00:07:15.046 Run time: 1 seconds
00:07:15.046 Verify: Yes
00:07:15.046
00:07:15.046 Running for 1 seconds...
00:07:15.046
00:07:15.046 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:15.046 ------------------------------------------------------------------------------------
00:07:15.046 0,0 78336/s 144 MiB/s 0 0
00:07:15.046 3,0 79232/s 146 MiB/s 0 0
00:07:15.046 2,0 78720/s 145 MiB/s 0 0
00:07:15.046 1,0 78848/s 145 MiB/s 0 0
00:07:15.046 ====================================================================================
00:07:15.046 Total 315136/s 1231 MiB/s 0 0'
00:07:15.046 02:55:09 -- accel/accel.sh@20 -- # IFS=:
00:07:15.046 02:55:09 -- accel/accel.sh@20 -- # read -r var val
00:07:15.046 02:55:09 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:15.046 02:55:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:15.046 02:55:09 -- accel/accel.sh@12 -- # build_accel_config
00:07:15.046 02:55:09 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:15.046 02:55:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:15.046 02:55:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:15.046 02:55:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:15.046 02:55:09 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:15.046 02:55:09 -- accel/accel.sh@41 -- # local IFS=,
00:07:15.046 02:55:09 -- accel/accel.sh@42 -- # jq -r .
00:07:15.046 [2024-07-14 02:55:09.983222] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
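With four reactors, each core gets its own row in the table, and the Total row is their sum: 78336 + 79232 + 78720 + 78848 = 315136 transfers/s. A quick check of that aggregation:

  # Sum the per-core transfer rates from the table above; prints 315136
  printf '%s\n' 78336 79232 78720 78848 | awk '{ s += $1 } END { print s }'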
00:07:15.046 [2024-07-14 02:55:09.983307] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663680 ] 00:07:15.046 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.046 [2024-07-14 02:55:10.053982] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:15.046 [2024-07-14 02:55:10.095514] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.046 [2024-07-14 02:55:10.095609] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:15.046 [2024-07-14 02:55:10.095702] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:15.046 [2024-07-14 02:55:10.095704] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val=0xf 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val=decompress 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val=software 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@23 -- # accel_module=software 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val=32 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.046 02:55:10 -- accel/accel.sh@21 -- # val=32 00:07:15.046 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.046 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.047 02:55:10 -- accel/accel.sh@21 -- # val=1 00:07:15.047 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.047 02:55:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:15.047 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.047 02:55:10 -- accel/accel.sh@21 -- # val=Yes 00:07:15.047 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.047 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.047 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:15.047 02:55:10 -- accel/accel.sh@21 -- # val= 00:07:15.047 02:55:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # IFS=: 00:07:15.047 02:55:10 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 
02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@21 -- # val= 00:07:16.421 02:55:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # IFS=: 00:07:16.421 02:55:11 -- accel/accel.sh@20 -- # read -r var val 00:07:16.421 02:55:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:16.421 02:55:11 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:16.421 02:55:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.421 00:07:16.421 real 0m2.610s 00:07:16.421 user 0m8.997s 00:07:16.421 sys 0m0.276s 00:07:16.421 02:55:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.421 02:55:11 -- common/autotest_common.sh@10 -- # set +x 00:07:16.421 ************************************ 00:07:16.421 END TEST accel_decomp_mcore 00:07:16.421 ************************************ 00:07:16.421 02:55:11 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:16.421 02:55:11 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:16.421 02:55:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:16.421 02:55:11 -- common/autotest_common.sh@10 -- # set +x 00:07:16.421 ************************************ 00:07:16.421 START TEST accel_decomp_full_mcore 00:07:16.421 ************************************ 00:07:16.421 02:55:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:16.421 02:55:11 -- accel/accel.sh@16 -- # local accel_opc 00:07:16.421 02:55:11 -- accel/accel.sh@17 -- # local accel_module 00:07:16.421 02:55:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:16.421 02:55:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:16.421 02:55:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.421 02:55:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.421 02:55:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.421 02:55:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.421 02:55:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.421 02:55:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.421 02:55:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.421 02:55:11 -- accel/accel.sh@42 -- # jq -r . 00:07:16.421 [2024-07-14 02:55:11.344217] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
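accel_decomp_full_mcore combines the two previous variations in one run, passing both -o 0 (the full 111250-byte transfers) and -m 0xf (four cores), as the traced command line below confirms. Sketch:

  #!/usr/bin/env bash
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # Full-block transfers across all four reactors.
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -y -o 0 -m 0xf -l "$SPDK/test/accel/bib"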
00:07:16.421 [2024-07-14 02:55:11.344324] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663905 ]
00:07:16.421 EAL: No free 2048 kB hugepages reported on node 1
00:07:16.421 [2024-07-14 02:55:11.417013] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:16.421 [2024-07-14 02:55:11.454241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:07:16.421 [2024-07-14 02:55:11.454334] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:07:16.421 [2024-07-14 02:55:11.454395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:07:16.421 [2024-07-14 02:55:11.454396] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:17.800 02:55:12 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:17.800
00:07:17.800 SPDK Configuration:
00:07:17.800 Core mask: 0xf
00:07:17.800
00:07:17.800 Accel Perf Configuration:
00:07:17.800 Workload Type: decompress
00:07:17.800 Transfer size: 111250 bytes
00:07:17.800 Vector count 1
00:07:17.800 Module: software
00:07:17.800 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:17.800 Queue depth: 32
00:07:17.800 Allocate depth: 32
00:07:17.800 # threads/core: 1
00:07:17.800 Run time: 1 seconds
00:07:17.800 Verify: Yes
00:07:17.800
00:07:17.800 Running for 1 seconds...
00:07:17.800
00:07:17.800 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:17.800 ------------------------------------------------------------------------------------
00:07:17.800 0,0 5760/s 237 MiB/s 0 0
00:07:17.800 3,0 5824/s 240 MiB/s 0 0
00:07:17.800 2,0 5792/s 239 MiB/s 0 0
00:07:17.800 1,0 5792/s 239 MiB/s 0 0
00:07:17.800 ====================================================================================
00:07:17.800 Total 23168/s 2458 MiB/s 0 0'
00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=:
00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val
00:07:17.800 02:55:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:17.800 02:55:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:17.800 02:55:12 -- accel/accel.sh@12 -- # build_accel_config
00:07:17.800 02:55:12 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:17.800 02:55:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:17.800 02:55:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:17.800 02:55:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:17.800 02:55:12 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:17.800 02:55:12 -- accel/accel.sh@41 -- # local IFS=,
00:07:17.800 02:55:12 -- accel/accel.sh@42 -- # jq -r .
00:07:17.800 [2024-07-14 02:55:12.655785] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
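Comparing the two four-core totals shows the trade made by the larger transfer size: far fewer operations per second but much higher aggregate bandwidth. Both figures follow from the tables, 315136 * 4096 / 2^20 is about 1231 MiB/s for the 4096-byte run, and 23168 * 111250 / 2^20 is about 2458 MiB/s here:

  # Aggregate bandwidth of the two mcore runs, computed from the table totals above
  awk 'BEGIN {
    printf "4096 B:   %.0f MiB/s\n", 315136 * 4096   / 1048576
    printf "111250 B: %.0f MiB/s\n", 23168  * 111250 / 1048576
  }'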
00:07:17.800 [2024-07-14 02:55:12.655875] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid664175 ] 00:07:17.800 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.800 [2024-07-14 02:55:12.724436] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:17.800 [2024-07-14 02:55:12.761108] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.800 [2024-07-14 02:55:12.761203] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:17.800 [2024-07-14 02:55:12.761286] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:17.800 [2024-07-14 02:55:12.761288] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val=0xf 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val=decompress 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.800 02:55:12 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:17.800 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.800 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val=software 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@23 -- # accel_module=software 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val=32 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val=32 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val=1 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val=Yes 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:17.801 02:55:12 -- accel/accel.sh@21 -- # val= 00:07:17.801 02:55:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # IFS=: 00:07:17.801 02:55:12 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 
02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@21 -- # val= 00:07:18.736 02:55:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # IFS=: 00:07:18.736 02:55:13 -- accel/accel.sh@20 -- # read -r var val 00:07:18.736 02:55:13 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:18.736 02:55:13 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:18.736 02:55:13 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.736 00:07:18.736 real 0m2.627s 00:07:18.736 user 0m9.053s 00:07:18.736 sys 0m0.286s 00:07:18.736 02:55:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.736 02:55:13 -- common/autotest_common.sh@10 -- # set +x 00:07:18.736 ************************************ 00:07:18.736 END TEST accel_decomp_full_mcore 00:07:18.736 ************************************ 00:07:18.995 02:55:13 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:18.995 02:55:13 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:18.995 02:55:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.995 02:55:13 -- common/autotest_common.sh@10 -- # set +x 00:07:18.995 ************************************ 00:07:18.995 START TEST accel_decomp_mthread 00:07:18.995 ************************************ 00:07:18.995 02:55:13 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:18.995 02:55:13 -- accel/accel.sh@16 -- # local accel_opc 00:07:18.995 02:55:13 -- accel/accel.sh@17 -- # local accel_module 00:07:18.995 02:55:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:18.995 02:55:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:18.995 02:55:14 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.995 02:55:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.995 02:55:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.995 02:55:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.995 02:55:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.995 02:55:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.995 02:55:14 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.995 02:55:14 -- accel/accel.sh@42 -- # jq -r . 00:07:18.995 [2024-07-14 02:55:14.019589] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:18.995 [2024-07-14 02:55:14.019674] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid664465 ] 00:07:18.995 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.995 [2024-07-14 02:55:14.088314] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.995 [2024-07-14 02:55:14.123447] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.371 02:55:15 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:20.371 00:07:20.371 SPDK Configuration: 00:07:20.371 Core mask: 0x1 00:07:20.371 00:07:20.371 Accel Perf Configuration: 00:07:20.371 Workload Type: decompress 00:07:20.371 Transfer size: 4096 bytes 00:07:20.371 Vector count 1 00:07:20.371 Module: software 00:07:20.371 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:20.371 Queue depth: 32 00:07:20.371 Allocate depth: 32 00:07:20.371 # threads/core: 2 00:07:20.371 Run time: 1 seconds 00:07:20.371 Verify: Yes 00:07:20.371 00:07:20.371 Running for 1 seconds... 00:07:20.371 00:07:20.371 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:20.371 ------------------------------------------------------------------------------------ 00:07:20.371 0,1 47168/s 86 MiB/s 0 0 00:07:20.371 0,0 47040/s 86 MiB/s 0 0 00:07:20.371 ==================================================================================== 00:07:20.371 Total 94208/s 368 MiB/s 0 0' 00:07:20.371 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.371 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.371 02:55:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:20.371 02:55:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:20.371 02:55:15 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.372 02:55:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.372 02:55:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.372 02:55:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.372 02:55:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.372 02:55:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.372 02:55:15 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.372 02:55:15 -- accel/accel.sh@42 -- # jq -r . 00:07:20.372 [2024-07-14 02:55:15.308232] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:20.372 [2024-07-14 02:55:15.308322] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid664738 ] 00:07:20.372 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.372 [2024-07-14 02:55:15.376528] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.372 [2024-07-14 02:55:15.410558] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=0x1 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=decompress 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=software 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@23 -- # accel_module=software 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=32 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 
-- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=32 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=2 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val=Yes 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:20.372 02:55:15 -- accel/accel.sh@21 -- # val= 00:07:20.372 02:55:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # IFS=: 00:07:20.372 02:55:15 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@21 -- # val= 00:07:21.749 02:55:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # IFS=: 00:07:21.749 02:55:16 -- accel/accel.sh@20 -- # read -r var val 00:07:21.749 02:55:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:21.749 02:55:16 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:21.749 02:55:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:21.749 00:07:21.749 real 0m2.585s 00:07:21.749 user 0m2.332s 00:07:21.749 sys 0m0.262s 00:07:21.749 02:55:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.749 02:55:16 -- common/autotest_common.sh@10 -- # set +x 
00:07:21.749 ************************************ 00:07:21.749 END TEST accel_decomp_mthread 00:07:21.749 ************************************ 00:07:21.749 02:55:16 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:21.749 02:55:16 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:21.749 02:55:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:21.749 02:55:16 -- common/autotest_common.sh@10 -- # set +x 00:07:21.749 ************************************ 00:07:21.749 START TEST accel_deomp_full_mthread 00:07:21.749 ************************************ 00:07:21.749 02:55:16 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:21.749 02:55:16 -- accel/accel.sh@16 -- # local accel_opc 00:07:21.749 02:55:16 -- accel/accel.sh@17 -- # local accel_module 00:07:21.749 02:55:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:21.749 02:55:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:21.749 02:55:16 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.749 02:55:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.749 02:55:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.749 02:55:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.749 02:55:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.749 02:55:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.749 02:55:16 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.749 02:55:16 -- accel/accel.sh@42 -- # jq -r . 00:07:21.749 [2024-07-14 02:55:16.653598] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:21.749 [2024-07-14 02:55:16.653700] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid665021 ] 00:07:21.749 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.749 [2024-07-14 02:55:16.723981] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.749 [2024-07-14 02:55:16.760505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.126 02:55:17 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:23.126 00:07:23.126 SPDK Configuration: 00:07:23.126 Core mask: 0x1 00:07:23.126 00:07:23.126 Accel Perf Configuration: 00:07:23.126 Workload Type: decompress 00:07:23.126 Transfer size: 111250 bytes 00:07:23.126 Vector count 1 00:07:23.126 Module: software 00:07:23.126 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:23.126 Queue depth: 32 00:07:23.126 Allocate depth: 32 00:07:23.126 # threads/core: 2 00:07:23.126 Run time: 1 seconds 00:07:23.126 Verify: Yes 00:07:23.126 00:07:23.126 Running for 1 seconds... 
00:07:23.126 00:07:23.126 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:23.126 ------------------------------------------------------------------------------------ 00:07:23.126 0,1 3008/s 124 MiB/s 0 0 00:07:23.126 0,0 2944/s 121 MiB/s 0 0 00:07:23.126 ==================================================================================== 00:07:23.126 Total 5952/s 631 MiB/s 0 0' 00:07:23.126 02:55:17 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:17 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.126 02:55:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.126 02:55:17 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.126 02:55:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.126 02:55:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.126 02:55:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.126 02:55:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.126 02:55:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.126 02:55:17 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.126 02:55:17 -- accel/accel.sh@42 -- # jq -r . 00:07:23.126 [2024-07-14 02:55:17.967585] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:23.126 [2024-07-14 02:55:17.967675] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid665215 ] 00:07:23.126 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.126 [2024-07-14 02:55:18.036376] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.126 [2024-07-14 02:55:18.070796] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val=0x1 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val=decompress 
00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val=software 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@23 -- # accel_module=software 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val=32 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.126 02:55:18 -- accel/accel.sh@21 -- # val=32 00:07:23.126 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.126 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.127 02:55:18 -- accel/accel.sh@21 -- # val=2 00:07:23.127 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.127 02:55:18 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:23.127 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.127 02:55:18 -- accel/accel.sh@21 -- # val=Yes 00:07:23.127 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.127 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.127 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:23.127 02:55:18 -- accel/accel.sh@21 -- # val= 00:07:23.127 02:55:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # IFS=: 00:07:23.127 02:55:18 -- accel/accel.sh@20 -- # read -r var val 00:07:24.063 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.063 02:55:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.063 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.063 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.063 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.063 02:55:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.064 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.064 02:55:19 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.064 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.064 02:55:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.064 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.064 02:55:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.064 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.064 02:55:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.064 02:55:19 -- accel/accel.sh@21 -- # val= 00:07:24.064 02:55:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # IFS=: 00:07:24.064 02:55:19 -- accel/accel.sh@20 -- # read -r var val 00:07:24.064 02:55:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:24.064 02:55:19 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:24.064 02:55:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:24.064 00:07:24.064 real 0m2.629s 00:07:24.064 user 0m2.371s 00:07:24.064 sys 0m0.266s 00:07:24.064 02:55:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.064 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.064 ************************************ 00:07:24.064 END TEST accel_deomp_full_mthread 00:07:24.064 ************************************ 00:07:24.064 02:55:19 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:24.064 02:55:19 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:24.064 02:55:19 -- accel/accel.sh@129 -- # build_accel_config 00:07:24.064 02:55:19 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:24.064 02:55:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:24.064 02:55:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:24.064 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.064 02:55:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.064 02:55:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.064 02:55:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:24.064 02:55:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:24.064 02:55:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:24.064 02:55:19 -- accel/accel.sh@42 -- # jq -r . 00:07:24.064 ************************************ 00:07:24.064 START TEST accel_dif_functional_tests 00:07:24.064 ************************************ 00:07:24.064 02:55:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:24.324 [2024-07-14 02:55:19.334192] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:24.324 [2024-07-14 02:55:19.334291] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid665422 ] 00:07:24.324 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.324 [2024-07-14 02:55:19.403712] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:24.324 [2024-07-14 02:55:19.440800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.324 [2024-07-14 02:55:19.440890] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.324 [2024-07-14 02:55:19.440890] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.324 00:07:24.324 00:07:24.324 CUnit - A unit testing framework for C - Version 2.1-3 00:07:24.324 http://cunit.sourceforge.net/ 00:07:24.324 00:07:24.324 00:07:24.324 Suite: accel_dif 00:07:24.324 Test: verify: DIF generated, GUARD check ...passed 00:07:24.324 Test: verify: DIF generated, APPTAG check ...passed 00:07:24.324 Test: verify: DIF generated, REFTAG check ...passed 00:07:24.324 Test: verify: DIF not generated, GUARD check ...[2024-07-14 02:55:19.503560] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:24.324 [2024-07-14 02:55:19.503610] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:24.324 passed 00:07:24.324 Test: verify: DIF not generated, APPTAG check ...[2024-07-14 02:55:19.503647] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:24.324 [2024-07-14 02:55:19.503666] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:24.324 passed 00:07:24.324 Test: verify: DIF not generated, REFTAG check ...[2024-07-14 02:55:19.503685] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:24.324 [2024-07-14 02:55:19.503703] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:24.324 passed 00:07:24.324 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:24.324 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-14 02:55:19.503747] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:24.324 passed 00:07:24.324 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:24.324 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:24.324 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:24.324 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-14 02:55:19.503843] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:24.324 passed 00:07:24.324 Test: generate copy: DIF generated, GUARD check ...passed 00:07:24.324 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:24.324 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:24.324 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:24.324 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:24.324 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:24.324 Test: generate copy: iovecs-len validate ...[2024-07-14 02:55:19.504008] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:24.324 passed 00:07:24.324 Test: generate copy: buffer alignment validate ...passed 00:07:24.324 00:07:24.324 Run Summary: Type Total Ran Passed Failed Inactive 00:07:24.324 suites 1 1 n/a 0 0 00:07:24.324 tests 20 20 20 0 0 00:07:24.324 asserts 204 204 204 0 n/a 00:07:24.324 00:07:24.324 Elapsed time = 0.002 seconds 00:07:24.583 00:07:24.583 real 0m0.343s 00:07:24.583 user 0m0.523s 00:07:24.583 sys 0m0.161s 00:07:24.583 02:55:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.583 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.583 ************************************ 00:07:24.583 END TEST accel_dif_functional_tests 00:07:24.583 ************************************ 00:07:24.583 00:07:24.583 real 0m54.744s 00:07:24.583 user 1m2.447s 00:07:24.583 sys 0m6.833s 00:07:24.583 02:55:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.583 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.583 ************************************ 00:07:24.583 END TEST accel 00:07:24.583 ************************************ 00:07:24.583 02:55:19 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:24.583 02:55:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:24.583 02:55:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:24.583 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.583 ************************************ 00:07:24.583 START TEST accel_rpc 00:07:24.583 ************************************ 00:07:24.583 02:55:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:24.583 * Looking for test storage... 00:07:24.583 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:24.583 02:55:19 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:24.583 02:55:19 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=665644 00:07:24.583 02:55:19 -- accel/accel_rpc.sh@15 -- # waitforlisten 665644 00:07:24.583 02:55:19 -- common/autotest_common.sh@819 -- # '[' -z 665644 ']' 00:07:24.583 02:55:19 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.583 02:55:19 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:24.583 02:55:19 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:24.583 02:55:19 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:24.584 02:55:19 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:24.584 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.843 [2024-07-14 02:55:19.840082] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:24.843 [2024-07-14 02:55:19.840172] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid665644 ] 00:07:24.843 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.843 [2024-07-14 02:55:19.908772] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.843 [2024-07-14 02:55:19.946642] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:24.843 [2024-07-14 02:55:19.946752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.843 02:55:19 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:24.843 02:55:19 -- common/autotest_common.sh@852 -- # return 0 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:24.843 02:55:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:24.843 02:55:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:24.843 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.843 ************************************ 00:07:24.843 START TEST accel_assign_opcode 00:07:24.843 ************************************ 00:07:24.843 02:55:19 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:24.843 02:55:19 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:24.843 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.843 [2024-07-14 02:55:19.979143] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:24.843 02:55:19 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:24.843 02:55:19 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:24.843 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:24.843 [2024-07-14 02:55:19.987152] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:24.843 02:55:19 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:24.843 02:55:19 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:24.843 02:55:19 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:24.843 02:55:19 -- common/autotest_common.sh@10 -- # set +x 00:07:25.102 02:55:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:25.102 02:55:20 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:25.102 02:55:20 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:25.102 02:55:20 -- accel/accel_rpc.sh@42 -- # grep software 00:07:25.102 02:55:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:25.102 02:55:20 -- common/autotest_common.sh@10 -- # set +x 00:07:25.102 02:55:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:25.102 software 00:07:25.102 00:07:25.102 real 0m0.227s 00:07:25.102 user 0m0.040s 00:07:25.102 sys 0m0.013s 00:07:25.102 02:55:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.102 02:55:20 -- common/autotest_common.sh@10 -- # set +x 
00:07:25.102 ************************************ 00:07:25.102 END TEST accel_assign_opcode 00:07:25.102 ************************************ 00:07:25.102 02:55:20 -- accel/accel_rpc.sh@55 -- # killprocess 665644 00:07:25.102 02:55:20 -- common/autotest_common.sh@926 -- # '[' -z 665644 ']' 00:07:25.102 02:55:20 -- common/autotest_common.sh@930 -- # kill -0 665644 00:07:25.102 02:55:20 -- common/autotest_common.sh@931 -- # uname 00:07:25.102 02:55:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:25.102 02:55:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 665644 00:07:25.102 02:55:20 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:25.102 02:55:20 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:25.102 02:55:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 665644' 00:07:25.102 killing process with pid 665644 00:07:25.102 02:55:20 -- common/autotest_common.sh@945 -- # kill 665644 00:07:25.102 02:55:20 -- common/autotest_common.sh@950 -- # wait 665644 00:07:25.362 00:07:25.362 real 0m0.844s 00:07:25.362 user 0m0.734s 00:07:25.362 sys 0m0.410s 00:07:25.362 02:55:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.362 02:55:20 -- common/autotest_common.sh@10 -- # set +x 00:07:25.362 ************************************ 00:07:25.362 END TEST accel_rpc 00:07:25.362 ************************************ 00:07:25.621 02:55:20 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:25.621 02:55:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:25.621 02:55:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.621 02:55:20 -- common/autotest_common.sh@10 -- # set +x 00:07:25.621 ************************************ 00:07:25.621 START TEST app_cmdline 00:07:25.621 ************************************ 00:07:25.621 02:55:20 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:25.621 * Looking for test storage... 00:07:25.621 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:25.621 02:55:20 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:25.621 02:55:20 -- app/cmdline.sh@17 -- # spdk_tgt_pid=665758 00:07:25.621 02:55:20 -- app/cmdline.sh@18 -- # waitforlisten 665758 00:07:25.621 02:55:20 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:25.621 02:55:20 -- common/autotest_common.sh@819 -- # '[' -z 665758 ']' 00:07:25.621 02:55:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.621 02:55:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:25.621 02:55:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.621 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.621 02:55:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:25.621 02:55:20 -- common/autotest_common.sh@10 -- # set +x 00:07:25.621 [2024-07-14 02:55:20.762955] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:25.621 [2024-07-14 02:55:20.763025] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid665758 ] 00:07:25.621 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.621 [2024-07-14 02:55:20.832165] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.621 [2024-07-14 02:55:20.869199] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.621 [2024-07-14 02:55:20.869311] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.558 02:55:21 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:26.558 02:55:21 -- common/autotest_common.sh@852 -- # return 0 00:07:26.558 02:55:21 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:26.558 { 00:07:26.558 "version": "SPDK v24.01.1-pre git sha1 4b94202c6", 00:07:26.558 "fields": { 00:07:26.558 "major": 24, 00:07:26.558 "minor": 1, 00:07:26.558 "patch": 1, 00:07:26.558 "suffix": "-pre", 00:07:26.558 "commit": "4b94202c6" 00:07:26.558 } 00:07:26.558 } 00:07:26.558 02:55:21 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:26.558 02:55:21 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:26.558 02:55:21 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:26.558 02:55:21 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:26.558 02:55:21 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:26.558 02:55:21 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:26.558 02:55:21 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:26.558 02:55:21 -- common/autotest_common.sh@10 -- # set +x 00:07:26.558 02:55:21 -- app/cmdline.sh@26 -- # sort 00:07:26.558 02:55:21 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:26.558 02:55:21 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:26.558 02:55:21 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:26.558 02:55:21 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:26.558 02:55:21 -- common/autotest_common.sh@640 -- # local es=0 00:07:26.558 02:55:21 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:26.558 02:55:21 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:26.558 02:55:21 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:26.558 02:55:21 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:26.558 02:55:21 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:26.558 02:55:21 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:26.558 02:55:21 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:26.558 02:55:21 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:26.558 02:55:21 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:26.558 02:55:21 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:26.818 request: 00:07:26.818 { 00:07:26.818 "method": "env_dpdk_get_mem_stats", 00:07:26.818 "req_id": 1 00:07:26.818 } 00:07:26.818 Got JSON-RPC error response 00:07:26.818 response: 00:07:26.818 { 00:07:26.818 "code": -32601, 00:07:26.818 "message": "Method not found" 00:07:26.818 } 00:07:26.818 02:55:21 -- common/autotest_common.sh@643 -- # es=1 00:07:26.818 02:55:21 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:26.818 02:55:21 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:26.818 02:55:21 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:26.818 02:55:21 -- app/cmdline.sh@1 -- # killprocess 665758 00:07:26.818 02:55:21 -- common/autotest_common.sh@926 -- # '[' -z 665758 ']' 00:07:26.818 02:55:21 -- common/autotest_common.sh@930 -- # kill -0 665758 00:07:26.818 02:55:21 -- common/autotest_common.sh@931 -- # uname 00:07:26.818 02:55:21 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:26.818 02:55:21 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 665758 00:07:26.818 02:55:21 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:26.818 02:55:21 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:26.818 02:55:21 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 665758' 00:07:26.818 killing process with pid 665758 00:07:26.818 02:55:21 -- common/autotest_common.sh@945 -- # kill 665758 00:07:26.818 02:55:21 -- common/autotest_common.sh@950 -- # wait 665758 00:07:27.077 00:07:27.077 real 0m1.634s 00:07:27.077 user 0m1.858s 00:07:27.077 sys 0m0.479s 00:07:27.077 02:55:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.077 02:55:22 -- common/autotest_common.sh@10 -- # set +x 00:07:27.077 ************************************ 00:07:27.077 END TEST app_cmdline 00:07:27.077 ************************************ 00:07:27.077 02:55:22 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:27.077 02:55:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:27.077 02:55:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:27.077 02:55:22 -- common/autotest_common.sh@10 -- # set +x 00:07:27.077 ************************************ 00:07:27.077 START TEST version 00:07:27.077 ************************************ 00:07:27.077 02:55:22 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:27.336 * Looking for test storage... 
00:07:27.336 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:27.336 02:55:22 -- app/version.sh@17 -- # get_header_version major 00:07:27.336 02:55:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:27.336 02:55:22 -- app/version.sh@14 -- # cut -f2 00:07:27.336 02:55:22 -- app/version.sh@14 -- # tr -d '"' 00:07:27.336 02:55:22 -- app/version.sh@17 -- # major=24 00:07:27.336 02:55:22 -- app/version.sh@18 -- # get_header_version minor 00:07:27.336 02:55:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:27.336 02:55:22 -- app/version.sh@14 -- # cut -f2 00:07:27.336 02:55:22 -- app/version.sh@14 -- # tr -d '"' 00:07:27.336 02:55:22 -- app/version.sh@18 -- # minor=1 00:07:27.336 02:55:22 -- app/version.sh@19 -- # get_header_version patch 00:07:27.336 02:55:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:27.336 02:55:22 -- app/version.sh@14 -- # cut -f2 00:07:27.336 02:55:22 -- app/version.sh@14 -- # tr -d '"' 00:07:27.336 02:55:22 -- app/version.sh@19 -- # patch=1 00:07:27.336 02:55:22 -- app/version.sh@20 -- # get_header_version suffix 00:07:27.336 02:55:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:27.336 02:55:22 -- app/version.sh@14 -- # cut -f2 00:07:27.336 02:55:22 -- app/version.sh@14 -- # tr -d '"' 00:07:27.336 02:55:22 -- app/version.sh@20 -- # suffix=-pre 00:07:27.336 02:55:22 -- app/version.sh@22 -- # version=24.1 00:07:27.336 02:55:22 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:27.336 02:55:22 -- app/version.sh@25 -- # version=24.1.1 00:07:27.336 02:55:22 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:27.336 02:55:22 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:27.336 02:55:22 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:27.336 02:55:22 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:27.336 02:55:22 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:27.336 00:07:27.336 real 0m0.177s 00:07:27.336 user 0m0.082s 00:07:27.336 sys 0m0.142s 00:07:27.336 02:55:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.336 02:55:22 -- common/autotest_common.sh@10 -- # set +x 00:07:27.336 ************************************ 00:07:27.336 END TEST version 00:07:27.336 ************************************ 00:07:27.336 02:55:22 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@204 -- # uname -s 00:07:27.336 02:55:22 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:27.336 02:55:22 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:27.336 02:55:22 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:27.336 02:55:22 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@268 -- # timing_exit lib 
00:07:27.336 02:55:22 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:27.336 02:55:22 -- common/autotest_common.sh@10 -- # set +x 00:07:27.336 02:55:22 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:27.336 02:55:22 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:27.336 02:55:22 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:27.336 02:55:22 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:27.336 02:55:22 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:27.336 02:55:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:27.336 02:55:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:27.336 02:55:22 -- common/autotest_common.sh@10 -- # set +x 00:07:27.597 ************************************ 00:07:27.597 START TEST llvm_fuzz 00:07:27.597 ************************************ 00:07:27.597 02:55:22 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:27.597 * Looking for test storage... 
00:07:27.597 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:27.597 02:55:22 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:27.597 02:55:22 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:27.597 02:55:22 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:27.597 02:55:22 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:27.597 02:55:22 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:27.597 02:55:22 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:27.597 02:55:22 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:27.597 02:55:22 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:27.597 02:55:22 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:27.597 02:55:22 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:27.597 02:55:22 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:27.597 02:55:22 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:27.597 02:55:22 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:27.597 02:55:22 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:27.597 02:55:22 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:27.597 02:55:22 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:27.597 02:55:22 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:27.597 02:55:22 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:27.597 02:55:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:27.597 02:55:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:27.597 02:55:22 -- common/autotest_common.sh@10 -- # set +x 00:07:27.597 ************************************ 00:07:27.597 START TEST nvmf_fuzz 00:07:27.597 ************************************ 00:07:27.597 02:55:22 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:27.597 * Looking for test storage... 
00:07:27.597 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:27.597 02:55:22 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:27.597 02:55:22 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:27.597 02:55:22 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:27.597 02:55:22 -- common/autotest_common.sh@34 -- # set -e 00:07:27.597 02:55:22 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:27.597 02:55:22 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:27.597 02:55:22 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:27.597 02:55:22 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:27.597 02:55:22 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:27.597 02:55:22 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:27.597 02:55:22 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:27.597 02:55:22 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:27.597 02:55:22 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:27.597 02:55:22 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:27.597 02:55:22 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:27.597 02:55:22 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:27.597 02:55:22 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:27.597 02:55:22 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:27.597 02:55:22 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:27.597 02:55:22 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:27.597 02:55:22 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:27.597 02:55:22 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:27.597 02:55:22 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:27.597 02:55:22 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:27.597 02:55:22 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:27.597 02:55:22 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:27.597 02:55:22 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:27.597 02:55:22 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:27.597 02:55:22 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:27.597 02:55:22 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:27.597 02:55:22 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:27.597 02:55:22 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:27.597 02:55:22 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:27.597 02:55:22 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:27.597 02:55:22 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:27.597 02:55:22 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:27.597 02:55:22 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:27.597 02:55:22 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:27.597 02:55:22 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:27.597 02:55:22 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:27.597 02:55:22 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:27.597 02:55:22 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:27.597 02:55:22 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:27.597 02:55:22 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:27.597 02:55:22 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:27.597 02:55:22 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:27.597 02:55:22 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:27.597 02:55:22 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:27.597 02:55:22 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:27.597 02:55:22 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:27.597 02:55:22 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:27.597 02:55:22 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:27.597 02:55:22 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:27.597 02:55:22 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:27.597 02:55:22 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:27.597 02:55:22 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:27.597 02:55:22 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:27.597 02:55:22 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:27.598 02:55:22 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:27.598 02:55:22 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:27.598 02:55:22 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:27.598 02:55:22 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:27.598 02:55:22 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:27.598 02:55:22 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:27.598 02:55:22 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:27.598 02:55:22 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:27.598 02:55:22 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:27.598 02:55:22 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:27.598 02:55:22 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:27.598 02:55:22 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:27.598 02:55:22 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:27.598 02:55:22 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:27.598 02:55:22 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:27.598 02:55:22 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:27.598 02:55:22 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:27.598 02:55:22 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:27.598 02:55:22 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:27.598 02:55:22 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:27.598 02:55:22 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:27.598 02:55:22 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:27.598 02:55:22 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:27.598 02:55:22 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:27.598 02:55:22 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:27.598 02:55:22 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:27.598 02:55:22 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:27.598 02:55:22 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:27.598 
02:55:22 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:27.598 02:55:22 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:27.598 02:55:22 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:27.598 02:55:22 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:27.598 02:55:22 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:27.598 02:55:22 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:27.598 02:55:22 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:27.598 02:55:22 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:27.598 02:55:22 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:27.598 02:55:22 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:27.598 02:55:22 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:27.598 02:55:22 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:27.598 02:55:22 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:27.598 02:55:22 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:27.598 02:55:22 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:27.598 02:55:22 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:27.598 02:55:22 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:27.598 #define SPDK_CONFIG_H 00:07:27.598 #define SPDK_CONFIG_APPS 1 00:07:27.598 #define SPDK_CONFIG_ARCH native 00:07:27.598 #undef SPDK_CONFIG_ASAN 00:07:27.598 #undef SPDK_CONFIG_AVAHI 00:07:27.598 #undef SPDK_CONFIG_CET 00:07:27.598 #define SPDK_CONFIG_COVERAGE 1 00:07:27.598 #define SPDK_CONFIG_CROSS_PREFIX 00:07:27.598 #undef SPDK_CONFIG_CRYPTO 00:07:27.598 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:27.598 #undef SPDK_CONFIG_CUSTOMOCF 00:07:27.598 #undef SPDK_CONFIG_DAOS 00:07:27.598 #define SPDK_CONFIG_DAOS_DIR 00:07:27.598 #define SPDK_CONFIG_DEBUG 1 00:07:27.598 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:27.598 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:27.598 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:27.598 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:27.598 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:27.598 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:27.598 #define SPDK_CONFIG_EXAMPLES 1 00:07:27.598 #undef SPDK_CONFIG_FC 00:07:27.598 #define SPDK_CONFIG_FC_PATH 00:07:27.598 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:27.598 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:27.598 #undef SPDK_CONFIG_FUSE 00:07:27.598 #define SPDK_CONFIG_FUZZER 1 00:07:27.598 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:27.598 #undef SPDK_CONFIG_GOLANG 00:07:27.598 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:27.598 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:27.598 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:27.598 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:27.598 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:27.598 #define SPDK_CONFIG_IDXD 1 00:07:27.598 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:27.598 #undef SPDK_CONFIG_IPSEC_MB 00:07:27.598 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:27.598 #define SPDK_CONFIG_ISAL 1 00:07:27.598 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:27.598 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:27.598 #define SPDK_CONFIG_LIBDIR 00:07:27.598 #undef SPDK_CONFIG_LTO 00:07:27.598 #define SPDK_CONFIG_MAX_LCORES 00:07:27.598 #define SPDK_CONFIG_NVME_CUSE 1 00:07:27.598 #undef SPDK_CONFIG_OCF 00:07:27.598 #define SPDK_CONFIG_OCF_PATH 00:07:27.598 #define SPDK_CONFIG_OPENSSL_PATH 00:07:27.598 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:27.598 #undef SPDK_CONFIG_PGO_USE 00:07:27.598 #define SPDK_CONFIG_PREFIX /usr/local 00:07:27.598 #undef SPDK_CONFIG_RAID5F 00:07:27.598 #undef SPDK_CONFIG_RBD 00:07:27.598 #define SPDK_CONFIG_RDMA 1 00:07:27.598 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:27.598 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:27.598 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:27.598 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:27.598 #undef SPDK_CONFIG_SHARED 00:07:27.598 #undef SPDK_CONFIG_SMA 00:07:27.598 #define SPDK_CONFIG_TESTS 1 00:07:27.598 #undef SPDK_CONFIG_TSAN 00:07:27.598 #define SPDK_CONFIG_UBLK 1 00:07:27.598 #define SPDK_CONFIG_UBSAN 1 00:07:27.598 #undef SPDK_CONFIG_UNIT_TESTS 00:07:27.598 #undef SPDK_CONFIG_URING 00:07:27.598 #define SPDK_CONFIG_URING_PATH 00:07:27.598 #undef SPDK_CONFIG_URING_ZNS 00:07:27.598 #undef SPDK_CONFIG_USDT 00:07:27.598 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:27.598 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:27.598 #define SPDK_CONFIG_VFIO_USER 1 00:07:27.598 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:27.598 #define SPDK_CONFIG_VHOST 1 00:07:27.598 #define SPDK_CONFIG_VIRTIO 1 00:07:27.598 #undef SPDK_CONFIG_VTUNE 00:07:27.598 #define SPDK_CONFIG_VTUNE_DIR 00:07:27.598 #define SPDK_CONFIG_WERROR 1 00:07:27.598 #define SPDK_CONFIG_WPDK_DIR 00:07:27.598 #undef SPDK_CONFIG_XNVME 00:07:27.598 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:27.598 02:55:22 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:27.598 02:55:22 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:27.598 02:55:22 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:27.598 02:55:22 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:27.860 02:55:22 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:27.860 02:55:22 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.860 02:55:22 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.860 02:55:22 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.860 02:55:22 -- paths/export.sh@5 -- # export PATH 00:07:27.860 02:55:22 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.860 02:55:22 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:27.860 02:55:22 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:27.860 02:55:22 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:27.860 02:55:22 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:27.860 02:55:22 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:27.860 02:55:22 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:27.860 02:55:22 -- pm/common@16 -- # TEST_TAG=N/A 00:07:27.860 02:55:22 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:27.860 02:55:22 -- common/autotest_common.sh@52 -- # : 1 00:07:27.860 02:55:22 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:27.860 02:55:22 -- common/autotest_common.sh@56 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:27.860 02:55:22 -- common/autotest_common.sh@58 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:27.860 02:55:22 -- common/autotest_common.sh@60 -- # : 1 00:07:27.860 02:55:22 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:27.860 02:55:22 -- common/autotest_common.sh@62 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:27.860 02:55:22 -- common/autotest_common.sh@64 -- # : 00:07:27.860 02:55:22 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:27.860 02:55:22 -- common/autotest_common.sh@66 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:27.860 02:55:22 
-- common/autotest_common.sh@68 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:27.860 02:55:22 -- common/autotest_common.sh@70 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:27.860 02:55:22 -- common/autotest_common.sh@72 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:27.860 02:55:22 -- common/autotest_common.sh@74 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:27.860 02:55:22 -- common/autotest_common.sh@76 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:27.860 02:55:22 -- common/autotest_common.sh@78 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:27.860 02:55:22 -- common/autotest_common.sh@80 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:27.860 02:55:22 -- common/autotest_common.sh@82 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:27.860 02:55:22 -- common/autotest_common.sh@84 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:27.860 02:55:22 -- common/autotest_common.sh@86 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:27.860 02:55:22 -- common/autotest_common.sh@88 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:27.860 02:55:22 -- common/autotest_common.sh@90 -- # : 0 00:07:27.860 02:55:22 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:27.860 02:55:22 -- common/autotest_common.sh@92 -- # : 1 00:07:27.860 02:55:22 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:27.860 02:55:22 -- common/autotest_common.sh@94 -- # : 1 00:07:27.860 02:55:22 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:27.860 02:55:22 -- common/autotest_common.sh@96 -- # : rdma 00:07:27.861 02:55:22 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:27.861 02:55:22 -- common/autotest_common.sh@98 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:27.861 02:55:22 -- common/autotest_common.sh@100 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:27.861 02:55:22 -- common/autotest_common.sh@102 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:27.861 02:55:22 -- common/autotest_common.sh@104 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:27.861 02:55:22 -- common/autotest_common.sh@106 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:27.861 02:55:22 -- common/autotest_common.sh@108 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:27.861 02:55:22 -- common/autotest_common.sh@110 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:27.861 02:55:22 -- common/autotest_common.sh@112 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:27.861 02:55:22 -- common/autotest_common.sh@114 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:27.861 
02:55:22 -- common/autotest_common.sh@116 -- # : 1 00:07:27.861 02:55:22 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:27.861 02:55:22 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:27.861 02:55:22 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:27.861 02:55:22 -- common/autotest_common.sh@120 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:27.861 02:55:22 -- common/autotest_common.sh@122 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:27.861 02:55:22 -- common/autotest_common.sh@124 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:27.861 02:55:22 -- common/autotest_common.sh@126 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:27.861 02:55:22 -- common/autotest_common.sh@128 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:27.861 02:55:22 -- common/autotest_common.sh@130 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:27.861 02:55:22 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:27.861 02:55:22 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:27.861 02:55:22 -- common/autotest_common.sh@134 -- # : true 00:07:27.861 02:55:22 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:27.861 02:55:22 -- common/autotest_common.sh@136 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:27.861 02:55:22 -- common/autotest_common.sh@138 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:27.861 02:55:22 -- common/autotest_common.sh@140 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:27.861 02:55:22 -- common/autotest_common.sh@142 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:27.861 02:55:22 -- common/autotest_common.sh@144 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:27.861 02:55:22 -- common/autotest_common.sh@146 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:27.861 02:55:22 -- common/autotest_common.sh@148 -- # : 00:07:27.861 02:55:22 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:27.861 02:55:22 -- common/autotest_common.sh@150 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:27.861 02:55:22 -- common/autotest_common.sh@152 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:27.861 02:55:22 -- common/autotest_common.sh@154 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:27.861 02:55:22 -- common/autotest_common.sh@156 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:27.861 02:55:22 -- common/autotest_common.sh@158 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:27.861 02:55:22 -- common/autotest_common.sh@160 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:27.861 02:55:22 -- common/autotest_common.sh@163 -- # : 00:07:27.861 02:55:22 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:27.861 02:55:22 -- common/autotest_common.sh@165 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:27.861 02:55:22 -- common/autotest_common.sh@167 -- # : 0 00:07:27.861 02:55:22 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:27.861 02:55:22 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:27.861 02:55:22 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:27.861 02:55:22 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:27.861 02:55:22 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:27.861 02:55:22 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:27.861 02:55:22 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:27.861 02:55:22 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:27.861 02:55:22 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:27.861 02:55:22 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:27.861 02:55:22 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:27.861 02:55:22 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:27.861 02:55:22 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:27.861 02:55:22 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:27.861 02:55:22 -- common/autotest_common.sh@196 -- # cat 00:07:27.861 02:55:22 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:27.861 02:55:22 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:27.861 02:55:22 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:27.861 02:55:22 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:27.861 02:55:22 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:27.861 02:55:22 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:27.861 02:55:22 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:27.861 02:55:22 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:27.861 02:55:22 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:27.861 02:55:22 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:27.861 02:55:22 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:27.861 02:55:22 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:27.861 02:55:22 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:27.861 02:55:22 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:27.861 02:55:22 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:27.861 02:55:22 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:27.861 02:55:22 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:27.861 02:55:22 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:27.861 02:55:22 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:27.861 02:55:22 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:27.861 02:55:22 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:27.861 02:55:22 -- common/autotest_common.sh@249 -- # valgrind= 00:07:27.861 02:55:22 -- common/autotest_common.sh@255 -- # uname -s 00:07:27.861 02:55:22 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:27.862 02:55:22 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:27.862 02:55:22 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:27.862 02:55:22 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:27.862 02:55:22 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:27.862 02:55:22 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:27.862 02:55:22 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:27.862 02:55:22 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:27.862 02:55:22 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:27.862 02:55:22 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:27.862 02:55:22 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:27.862 02:55:22 -- common/autotest_common.sh@309 -- # [[ -z 666397 ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@309 -- # kill -0 666397 00:07:27.862 02:55:22 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:27.862 02:55:22 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:27.862 02:55:22 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:27.862 02:55:22 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:27.862 02:55:22 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:27.862 02:55:22 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:27.862 02:55:22 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:27.862 02:55:22 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.BmxX3h 00:07:27.862 02:55:22 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:27.862 02:55:22 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.BmxX3h/tests/nvmf /tmp/spdk.BmxX3h 00:07:27.862 02:55:22 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@318 -- # df -T 00:07:27.862 02:55:22 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:27.862 02:55:22 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:27.862 02:55:22 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=53091360768 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:07:27.862 02:55:22 -- common/autotest_common.sh@354 -- # uses["$mount"]=8650956800 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:27.862 02:55:22 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:07:27.862 02:55:22 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870515712 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:27.862 02:55:22 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=643072 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:27.862 02:55:22 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:27.862 02:55:22 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:27.862 02:55:22 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:27.862 02:55:22 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:27.862 * Looking for test storage... 00:07:27.862 02:55:22 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:27.862 02:55:22 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:27.862 02:55:22 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:27.862 02:55:22 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:27.862 02:55:22 -- common/autotest_common.sh@363 -- # mount=/ 00:07:27.862 02:55:22 -- common/autotest_common.sh@365 -- # target_space=53091360768 00:07:27.862 02:55:22 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:27.862 02:55:22 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:27.862 02:55:22 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@372 -- # new_size=10865549312 00:07:27.862 02:55:22 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:27.862 02:55:22 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:27.862 02:55:22 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:27.862 02:55:22 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:27.862 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:27.862 02:55:22 -- common/autotest_common.sh@380 -- # return 0 00:07:27.862 02:55:22 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:27.862 02:55:22 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:27.862 02:55:22 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:27.862 02:55:22 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:27.862 02:55:22 -- common/autotest_common.sh@1672 -- # true 00:07:27.862 02:55:22 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:27.862 02:55:22 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:27.862 02:55:22 -- common/autotest_common.sh@27 -- # exec 00:07:27.862 02:55:22 -- common/autotest_common.sh@29 -- # exec 00:07:27.862 02:55:22 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:07:27.862 02:55:22 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:27.862 02:55:22 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:27.862 02:55:22 -- common/autotest_common.sh@18 -- # set -x 00:07:27.862 02:55:22 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:27.862 02:55:22 -- ../common.sh@8 -- # pids=() 00:07:27.862 02:55:22 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:27.862 02:55:22 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:27.862 02:55:22 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:27.862 02:55:22 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:27.862 02:55:22 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:27.862 02:55:22 -- nvmf/run.sh@61 -- # mem_size=512 00:07:27.862 02:55:22 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:27.862 02:55:22 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:27.862 02:55:22 -- ../common.sh@69 -- # local fuzz_num=25 00:07:27.862 02:55:22 -- ../common.sh@70 -- # local time=1 00:07:27.862 02:55:22 -- ../common.sh@72 -- # (( i = 0 )) 00:07:27.862 02:55:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:27.862 02:55:22 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:27.862 02:55:22 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:27.862 02:55:22 -- nvmf/run.sh@24 -- # local timen=1 00:07:27.862 02:55:22 -- nvmf/run.sh@25 -- # local core=0x1 00:07:27.862 02:55:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:27.862 02:55:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:27.862 02:55:22 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:27.862 02:55:22 -- nvmf/run.sh@29 -- # port=4400 00:07:27.862 02:55:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:27.862 02:55:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:27.862 02:55:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:27.862 02:55:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:27.862 [2024-07-14 02:55:23.021937] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
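By this point nvmf/run.sh has finished its per-target setup, all of it visible in the trace: it sized the run by counting '.fn =' entries in llvm_nvme_fuzz.c (grep -c reports 25 fuzzer targets), mapped target index 0 to TCP port 4400, rewrote the stock fuzz_json.conf so its trsvcid matches, and launched llvm_nvme_fuzz against the resulting transport ID. A condensed sketch of that sequence: $rootdir and $testdir stand in for the long Jenkins paths, the 44xx port rule is inferred from printf %02d plus the traced value 4400, and the -Z comment is an inference, while every flag itself appears verbatim in the trace:

    fuzzer_type=0                                  # first of the 25 detected targets
    port="44$(printf %02d "$fuzzer_type")"         # 0 -> 4400, 1 -> 4401, ... (inferred rule)
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    corpus_dir="$rootdir/../corpus/llvm_nvmf_$fuzzer_type"
    mkdir -p "$corpus_dir"
    # Point the JSON config at this target's port instead of the default 4420.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$testdir/fuzz_json.conf" > "/tmp/fuzz_json_$fuzzer_type.conf"
    args=(
        -m 0x1 -s 512                  # core mask 0x1, 512 MB hugepage memory
        -P "$rootdir/../output/llvm/"  # output prefix seen in the trace
        -F "$trid"                     # NVMe/TCP transport ID to fuzz
        -c "/tmp/fuzz_json_$fuzzer_type.conf"
        -t 1                           # one second per target in the short run
        -D "$corpus_dir"               # persistent corpus directory
        -Z "$fuzzer_type"              # traced as -Z 0; target selector (inferred)
        -r /var/tmp/spdk0.sock         # SPDK RPC socket
    )
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" "${args[@]}"

The DPDK EAL banner and the NVMe/TCP "Listening on 127.0.0.1 port 4400" notice that follow are that process starting up.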
00:07:27.862 [2024-07-14 02:55:23.022015] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid666435 ] 00:07:27.862 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.122 [2024-07-14 02:55:23.291348] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.122 [2024-07-14 02:55:23.319470] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.122 [2024-07-14 02:55:23.319596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.122 [2024-07-14 02:55:23.370991] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:28.379 [2024-07-14 02:55:23.387281] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:28.379 INFO: Running with entropic power schedule (0xFF, 100). 00:07:28.379 INFO: Seed: 2059011203 00:07:28.379 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:28.379 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:28.379 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:28.379 INFO: A corpus is not provided, starting from an empty corpus 00:07:28.379 #2 INITED exec/s: 0 rss: 59Mb 00:07:28.379 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:28.379 This may also happen if the target rejected all inputs we tried so far 00:07:28.379 [2024-07-14 02:55:23.435799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.379 [2024-07-14 02:55:23.435829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.637 NEW_FUNC[1/671]: 0x491840 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:28.637 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:28.637 #10 NEW cov: 11488 ft: 11490 corp: 2/123b lim: 320 exec/s: 0 rss: 66Mb L: 122/122 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:28.637 [2024-07-14 02:55:23.746551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.637 [2024-07-14 02:55:23.746587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.637 #11 NEW cov: 11602 ft: 11849 corp: 3/245b lim: 320 exec/s: 0 rss: 66Mb L: 122/122 MS: 1 ChangeBit- 00:07:28.637 [2024-07-14 02:55:23.786654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.637 [2024-07-14 02:55:23.786685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.637 #12 NEW cov: 11608 ft: 12213 corp: 4/361b lim: 320 exec/s: 0 rss: 66Mb L: 116/122 MS: 1 EraseBytes- 00:07:28.637 [2024-07-14 02:55:23.826805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN 
COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.637 [2024-07-14 02:55:23.826830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.637 [2024-07-14 02:55:23.826880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.637 [2024-07-14 02:55:23.826894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.637 #13 NEW cov: 11713 ft: 12616 corp: 5/491b lim: 320 exec/s: 0 rss: 66Mb L: 130/130 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\017"- 00:07:28.637 [2024-07-14 02:55:23.866890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.637 [2024-07-14 02:55:23.866916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.637 [2024-07-14 02:55:23.866964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.637 [2024-07-14 02:55:23.866978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.895 #14 NEW cov: 11713 ft: 12790 corp: 6/621b lim: 320 exec/s: 0 rss: 66Mb L: 130/130 MS: 1 ChangeByte- 00:07:28.895 [2024-07-14 02:55:23.907196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.895 [2024-07-14 02:55:23.907222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.896 [2024-07-14 02:55:23.907271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.896 [2024-07-14 02:55:23.907284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.896 [2024-07-14 02:55:23.907333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.896 [2024-07-14 02:55:23.907346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.896 #15 NEW cov: 11713 ft: 13008 corp: 7/819b lim: 320 exec/s: 0 rss: 66Mb L: 198/198 MS: 1 CrossOver- 00:07:28.896 [2024-07-14 02:55:23.947259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.896 [2024-07-14 02:55:23.947286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.896 [2024-07-14 02:55:23.947335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000040 00:07:28.896 [2024-07-14 02:55:23.947349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.896 [2024-07-14 02:55:23.947397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 
nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.896 [2024-07-14 02:55:23.947411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.896 #21 NEW cov: 11713 ft: 13048 corp: 8/1017b lim: 320 exec/s: 0 rss: 67Mb L: 198/198 MS: 1 ChangeByte- 00:07:28.896 [2024-07-14 02:55:23.987178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.896 [2024-07-14 02:55:23.987204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.896 #22 NEW cov: 11713 ft: 13218 corp: 9/1139b lim: 320 exec/s: 0 rss: 67Mb L: 122/198 MS: 1 ChangeBit- 00:07:28.896 [2024-07-14 02:55:24.017244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.896 [2024-07-14 02:55:24.017270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.896 #23 NEW cov: 11713 ft: 13300 corp: 10/1261b lim: 320 exec/s: 0 rss: 67Mb L: 122/198 MS: 1 ShuffleBytes- 00:07:28.896 [2024-07-14 02:55:24.057363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.896 [2024-07-14 02:55:24.057392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.896 #24 NEW cov: 11713 ft: 13407 corp: 11/1377b lim: 320 exec/s: 0 rss: 67Mb L: 116/198 MS: 1 CMP- DE: "\001\000\000\000\000\000u0"- 00:07:28.896 [2024-07-14 02:55:24.097636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.896 [2024-07-14 02:55:24.097663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.896 [2024-07-14 02:55:24.097718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (75) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.896 [2024-07-14 02:55:24.097732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.896 NEW_FUNC[1/1]: 0x16ee020 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:28.896 #25 NEW cov: 11727 ft: 13753 corp: 12/1507b lim: 320 exec/s: 0 rss: 67Mb L: 130/198 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000u0"- 00:07:28.896 [2024-07-14 02:55:24.137608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.896 [2024-07-14 02:55:24.137634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 #26 NEW cov: 11727 ft: 13771 corp: 13/1623b lim: 320 exec/s: 0 rss: 67Mb L: 116/198 MS: 1 ChangeBinInt- 00:07:29.155 [2024-07-14 02:55:24.177749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.155 
[2024-07-14 02:55:24.177774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 #27 NEW cov: 11727 ft: 13797 corp: 14/1745b lim: 320 exec/s: 0 rss: 67Mb L: 122/198 MS: 1 CopyPart- 00:07:29.155 [2024-07-14 02:55:24.207856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.155 [2024-07-14 02:55:24.207881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 #28 NEW cov: 11727 ft: 13867 corp: 15/1867b lim: 320 exec/s: 0 rss: 67Mb L: 122/198 MS: 1 ShuffleBytes- 00:07:29.155 [2024-07-14 02:55:24.247950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.155 [2024-07-14 02:55:24.247976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 #29 NEW cov: 11727 ft: 13891 corp: 16/1983b lim: 320 exec/s: 0 rss: 67Mb L: 116/198 MS: 1 ChangeBinInt- 00:07:29.155 [2024-07-14 02:55:24.288094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.155 [2024-07-14 02:55:24.288120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 #32 NEW cov: 11727 ft: 13910 corp: 17/2075b lim: 320 exec/s: 0 rss: 67Mb L: 92/198 MS: 3 ChangeByte-ShuffleBytes-CrossOver- 00:07:29.155 [2024-07-14 02:55:24.328173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.155 [2024-07-14 02:55:24.328199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:29.155 #33 NEW cov: 11750 ft: 13958 corp: 18/2191b lim: 320 exec/s: 0 rss: 67Mb L: 116/198 MS: 1 ShuffleBytes- 00:07:29.155 [2024-07-14 02:55:24.368503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.155 [2024-07-14 02:55:24.368529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.155 [2024-07-14 02:55:24.368583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (75) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.155 [2024-07-14 02:55:24.368598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.155 #34 NEW cov: 11750 ft: 13977 corp: 19/2321b lim: 320 exec/s: 0 rss: 67Mb L: 130/198 MS: 1 ChangeBit- 00:07:29.414 [2024-07-14 02:55:24.408608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.414 [2024-07-14 02:55:24.408634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 [2024-07-14 02:55:24.408691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (69) qid:0 cid:5 nsid:69696969 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-07-14 02:55:24.408705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.414 #35 NEW cov: 11750 ft: 14057 corp: 20/2471b lim: 320 exec/s: 35 rss: 67Mb L: 150/198 MS: 1 InsertRepeatedBytes- 00:07:29.414 [2024-07-14 02:55:24.448796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.414 [2024-07-14 02:55:24.448821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 [2024-07-14 02:55:24.448869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.414 [2024-07-14 02:55:24.448882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.414 [2024-07-14 02:55:24.448929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.414 [2024-07-14 02:55:24.448942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.414 #36 NEW cov: 11750 ft: 14063 corp: 21/2669b lim: 320 exec/s: 36 rss: 67Mb L: 198/198 MS: 1 CopyPart- 00:07:29.414 [2024-07-14 02:55:24.488692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2000 00:07:29.414 [2024-07-14 02:55:24.488717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 #37 NEW cov: 11750 ft: 14118 corp: 22/2791b lim: 320 exec/s: 37 rss: 68Mb L: 122/198 MS: 1 ChangeByte- 00:07:29.414 [2024-07-14 02:55:24.518756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.414 [2024-07-14 02:55:24.518781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 #38 NEW cov: 11750 ft: 14136 corp: 23/2907b lim: 320 exec/s: 38 rss: 68Mb L: 116/198 MS: 1 ChangeBit- 00:07:29.414 [2024-07-14 02:55:24.559054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.414 [2024-07-14 02:55:24.559079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 [2024-07-14 02:55:24.559139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (69) qid:0 cid:5 nsid:69696969 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-07-14 02:55:24.559153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.414 #39 NEW cov: 11750 ft: 14172 corp: 24/3058b lim: 320 exec/s: 39 rss: 68Mb L: 151/198 MS: 1 InsertByte- 
00:07:29.414 [2024-07-14 02:55:24.599011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.414 [2024-07-14 02:55:24.599037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 #40 NEW cov: 11750 ft: 14180 corp: 25/3161b lim: 320 exec/s: 40 rss: 68Mb L: 103/198 MS: 1 EraseBytes- 00:07:29.414 [2024-07-14 02:55:24.639102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.414 [2024-07-14 02:55:24.639127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 #41 NEW cov: 11750 ft: 14218 corp: 26/3283b lim: 320 exec/s: 41 rss: 68Mb L: 122/198 MS: 1 CrossOver- 00:07:29.672 [2024-07-14 02:55:24.679417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.672 [2024-07-14 02:55:24.679447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.672 [2024-07-14 02:55:24.679507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (75) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.672 [2024-07-14 02:55:24.679521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.672 #42 NEW cov: 11750 ft: 14231 corp: 27/3414b lim: 320 exec/s: 42 rss: 68Mb L: 131/198 MS: 1 InsertByte- 00:07:29.672 [2024-07-14 02:55:24.719305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.672 [2024-07-14 02:55:24.719331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.672 #43 NEW cov: 11750 ft: 14270 corp: 28/3536b lim: 320 exec/s: 43 rss: 68Mb L: 122/198 MS: 1 ChangeBinInt- 00:07:29.673 [2024-07-14 02:55:24.759460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.673 [2024-07-14 02:55:24.759485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.673 #44 NEW cov: 11750 ft: 14291 corp: 29/3658b lim: 320 exec/s: 44 rss: 68Mb L: 122/198 MS: 1 ShuffleBytes- 00:07:29.673 [2024-07-14 02:55:24.789651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.673 [2024-07-14 02:55:24.789677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.673 [2024-07-14 02:55:24.789726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.673 [2024-07-14 02:55:24.789739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.673 #45 NEW cov: 
11750 ft: 14305 corp: 30/3788b lim: 320 exec/s: 45 rss: 68Mb L: 130/198 MS: 1 CMP- DE: "\001)\355\213\032\026\265X"- 00:07:29.673 [2024-07-14 02:55:24.829904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.673 [2024-07-14 02:55:24.829929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.673 [2024-07-14 02:55:24.829988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (69) qid:0 cid:5 nsid:69696969 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.673 [2024-07-14 02:55:24.830002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.673 [2024-07-14 02:55:24.830051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.673 [2024-07-14 02:55:24.830065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.673 #46 NEW cov: 11750 ft: 14348 corp: 31/4042b lim: 320 exec/s: 46 rss: 68Mb L: 254/254 MS: 1 CrossOver- 00:07:29.673 [2024-07-14 02:55:24.869862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.673 [2024-07-14 02:55:24.869886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.673 [2024-07-14 02:55:24.869936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.673 [2024-07-14 02:55:24.869949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.673 #47 NEW cov: 11750 ft: 14353 corp: 32/4173b lim: 320 exec/s: 47 rss: 68Mb L: 131/254 MS: 1 InsertByte- 00:07:29.673 [2024-07-14 02:55:24.909982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x80ffffffffffffff 00:07:29.673 [2024-07-14 02:55:24.910008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.673 [2024-07-14 02:55:24.910059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.673 [2024-07-14 02:55:24.910072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.931 #48 NEW cov: 11750 ft: 14367 corp: 33/4316b lim: 320 exec/s: 48 rss: 68Mb L: 143/254 MS: 1 InsertRepeatedBytes- 00:07:29.931 [2024-07-14 02:55:24.950002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.931 [2024-07-14 02:55:24.950028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.931 #49 NEW cov: 11750 ft: 14388 corp: 34/4433b lim: 320 exec/s: 49 rss: 68Mb L: 117/254 MS: 1 InsertByte- 00:07:29.931 [2024-07-14 02:55:24.980124] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000029 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.931 [2024-07-14 02:55:24.980149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.931 #50 NEW cov: 11750 ft: 14399 corp: 35/4549b lim: 320 exec/s: 50 rss: 68Mb L: 116/254 MS: 1 ChangeByte- 00:07:29.931 [2024-07-14 02:55:25.020360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.931 [2024-07-14 02:55:25.020385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.931 [2024-07-14 02:55:25.020447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (75) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.931 [2024-07-14 02:55:25.020461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.931 #51 NEW cov: 11750 ft: 14420 corp: 36/4680b lim: 320 exec/s: 51 rss: 68Mb L: 131/254 MS: 1 ChangeByte- 00:07:29.932 [2024-07-14 02:55:25.060355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000029 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.932 [2024-07-14 02:55:25.060381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.932 #52 NEW cov: 11750 ft: 14427 corp: 37/4796b lim: 320 exec/s: 52 rss: 68Mb L: 116/254 MS: 1 ChangeBinInt- 00:07:29.932 [2024-07-14 02:55:25.100448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:0000292c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.932 [2024-07-14 02:55:25.100473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.932 #53 NEW cov: 11750 ft: 14477 corp: 38/4913b lim: 320 exec/s: 53 rss: 69Mb L: 117/254 MS: 1 InsertByte- 00:07:29.932 [2024-07-14 02:55:25.130686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.932 [2024-07-14 02:55:25.130711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.932 [2024-07-14 02:55:25.130769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (75) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.932 [2024-07-14 02:55:25.130784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.932 #54 NEW cov: 11750 ft: 14482 corp: 39/5043b lim: 320 exec/s: 54 rss: 69Mb L: 130/254 MS: 1 ChangeBit- 00:07:29.932 [2024-07-14 02:55:25.170874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.932 [2024-07-14 02:55:25.170900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.932 [2024-07-14 02:55:25.170949] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.932 [2024-07-14 02:55:25.170963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.932 [2024-07-14 02:55:25.171014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.932 [2024-07-14 02:55:25.171027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.191 #55 NEW cov: 11750 ft: 14484 corp: 40/5238b lim: 320 exec/s: 55 rss: 69Mb L: 195/254 MS: 1 CopyPart- 00:07:30.191 [2024-07-14 02:55:25.200857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.191 [2024-07-14 02:55:25.200884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.191 [2024-07-14 02:55:25.200934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.191 [2024-07-14 02:55:25.200947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.191 #56 NEW cov: 11750 ft: 14538 corp: 41/5369b lim: 320 exec/s: 56 rss: 69Mb L: 131/254 MS: 1 ChangeBinInt- 00:07:30.191 [2024-07-14 02:55:25.240957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.191 [2024-07-14 02:55:25.240984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.191 [2024-07-14 02:55:25.241037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.191 [2024-07-14 02:55:25.241051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.191 #57 NEW cov: 11750 ft: 14553 corp: 42/5499b lim: 320 exec/s: 57 rss: 69Mb L: 130/254 MS: 1 ChangeBinInt- 00:07:30.191 [2024-07-14 02:55:25.280978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.191 [2024-07-14 02:55:25.281005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.191 #58 NEW cov: 11750 ft: 14566 corp: 43/5565b lim: 320 exec/s: 58 rss: 69Mb L: 66/254 MS: 1 EraseBytes- 00:07:30.191 [2024-07-14 02:55:25.311312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.191 [2024-07-14 02:55:25.311338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.191 [2024-07-14 02:55:25.311395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (75) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.191 [2024-07-14 02:55:25.311409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:30.191 [2024-07-14 02:55:25.311469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:30.191 [2024-07-14 02:55:25.311484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:30.191 #59 NEW cov: 11750 ft: 14628 corp: 44/5792b lim: 320 exec/s: 59 rss: 69Mb L: 227/254 MS: 1 InsertRepeatedBytes-
00:07:30.191 [2024-07-14 02:55:25.351299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff000000
00:07:30.191 [2024-07-14 02:55:25.351325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:30.191 [2024-07-14 02:55:25.351374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:07:30.191 [2024-07-14 02:55:25.351388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:30.191 #60 NEW cov: 11750 ft: 14635 corp: 45/5943b lim: 320 exec/s: 60 rss: 69Mb L: 151/254 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\017"-
00:07:30.191 [2024-07-14 02:55:25.391529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:30.191 [2024-07-14 02:55:25.391556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:30.191 [2024-07-14 02:55:25.391607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:07:30.191 [2024-07-14 02:55:25.391621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:30.191 [2024-07-14 02:55:25.391668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:18180000 cdw10:18181818 cdw11:18181818
00:07:30.191 [2024-07-14 02:55:25.391683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:30.191 #61 NEW cov: 11750 ft: 14643 corp: 46/6192b lim: 320 exec/s: 61 rss: 70Mb L: 249/254 MS: 1 InsertRepeatedBytes-
00:07:30.191 [2024-07-14 02:55:25.421389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (a7) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:30.191 [2024-07-14 02:55:25.421415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:30.191 #62 NEW cov: 11750 ft: 14675 corp: 47/6308b lim: 320 exec/s: 31 rss: 70Mb L: 116/254 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\017"-
00:07:30.191 #62 DONE cov: 11750 ft: 14675 corp: 47/6308b lim: 320 exec/s: 31 rss: 70Mb
00:07:30.191 ###### Recommended dictionary. ######
00:07:30.191 "\001\000\000\000\000\000\000\017" # Uses: 3
00:07:30.191 "\001\000\000\000\000\000u0" # Uses: 1
00:07:30.191 "\001)\355\213\032\026\265X" # Uses: 0
00:07:30.191 ###### End of recommended dictionary. ######
00:07:30.191 Done 62 runs in 2 second(s)
00:07:30.451 02:55:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf
00:07:30.451 02:55:25 -- ../common.sh@72 -- # (( i++ ))
00:07:30.451 02:55:25 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:30.451 02:55:25 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:07:30.451 02:55:25 -- nvmf/run.sh@23 -- # local fuzzer_type=1
00:07:30.451 02:55:25 -- nvmf/run.sh@24 -- # local timen=1
00:07:30.451 02:55:25 -- nvmf/run.sh@25 -- # local core=0x1
00:07:30.451 02:55:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:30.451 02:55:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf
00:07:30.451 02:55:25 -- nvmf/run.sh@29 -- # printf %02d 1
00:07:30.451 02:55:25 -- nvmf/run.sh@29 -- # port=4401
00:07:30.451 02:55:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:30.451 02:55:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401'
00:07:30.451 02:55:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:30.451 02:55:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock
00:07:30.451 [2024-07-14 02:55:25.598394] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:07:30.451 [2024-07-14 02:55:25.598504] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid666922 ]
00:07:30.710 EAL: No free 2048 kB hugepages reported on node 1
00:07:30.710 [2024-07-14 02:55:25.851182] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:30.710 [2024-07-14 02:55:25.880132] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:30.710 [2024-07-14 02:55:25.880255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:30.710 [2024-07-14 02:55:25.931622] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:30.710 [2024-07-14 02:55:25.947912] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 ***
00:07:30.969 INFO: Running with entropic power schedule (0xFF, 100).
00:07:30.969 INFO: Seed: 327049501
00:07:30.969 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37),
00:07:30.969 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8),
00:07:30.969 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:30.969 INFO: A corpus is not provided, starting from an empty corpus
00:07:30.969 #2 INITED exec/s: 0 rss: 59Mb
00:07:30.969 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:30.969 This may also happen if the target rejected all inputs we tried so far
00:07:30.969 [2024-07-14 02:55:26.003030] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:30.969 [2024-07-14 02:55:26.003145] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:30.969 [2024-07-14 02:55:26.003249] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:30.969 [2024-07-14 02:55:26.003352] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:30.969 [2024-07-14 02:55:26.003560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:30.969 [2024-07-14 02:55:26.003594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:30.969 [2024-07-14 02:55:26.003642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:30.969 [2024-07-14 02:55:26.003657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:30.969 [2024-07-14 02:55:26.003708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:30.969 [2024-07-14 02:55:26.003722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:30.969 [2024-07-14 02:55:26.003773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:30.969 [2024-07-14 02:55:26.003787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:31.228 NEW_FUNC[1/671]: 0x492140 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67
00:07:31.228 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:31.228 #5 NEW cov: 11553 ft: 11551 corp: 2/30b lim: 30 exec/s: 0 rss: 66Mb L: 29/29 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes-
00:07:31.228 [2024-07-14 02:55:26.313790] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.313922] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.314143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.314177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.228 [2024-07-14 02:55:26.314232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.314248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.228 #16 NEW cov: 11666 ft: 12627 corp: 3/45b lim: 30 exec/s: 0 rss: 66Mb L: 15/29 MS: 1 InsertRepeatedBytes-
00:07:31.228 [2024-07-14 02:55:26.353829] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.353945] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.354145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.354172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.228 [2024-07-14 02:55:26.354227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.354242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.228 #22 NEW cov: 11672 ft: 12874 corp: 4/60b lim: 30 exec/s: 0 rss: 66Mb L: 15/29 MS: 1 ShuffleBytes-
00:07:31.228 [2024-07-14 02:55:26.393972] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.394087] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92
00:07:31.228 [2024-07-14 02:55:26.394195] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.394410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.394440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.228 [2024-07-14 02:55:26.394500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.394515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.228 [2024-07-14 02:55:26.394570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.394584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:31.228 #23 NEW cov: 11757 ft: 13371 corp: 5/79b lim: 30 exec/s: 0 rss: 66Mb L: 19/29 MS: 1 InsertRepeatedBytes-
00:07:31.228 [2024-07-14 02:55:26.434043] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.434160] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.228 [2024-07-14 02:55:26.434366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.228 [2024-07-14 02:55:26.434393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.229 [2024-07-14 02:55:26.434454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.434469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.229 #24 NEW cov: 11757 ft: 13475 corp: 6/94b lim: 30 exec/s: 0 rss: 66Mb L: 15/29 MS: 1 ShuffleBytes-
00:07:31.229 [2024-07-14 02:55:26.474205] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.229 [2024-07-14 02:55:26.474320] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92
00:07:31.229 [2024-07-14 02:55:26.474426] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.229 [2024-07-14 02:55:26.474651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.474679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.229 [2024-07-14 02:55:26.474732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.474747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.229 [2024-07-14 02:55:26.474800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.474814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:31.229 #25 NEW cov: 11757 ft: 13611 corp: 7/113b lim: 30 exec/s: 0 rss: 66Mb L: 19/29 MS: 1 ChangeBit-
00:07:31.229 [2024-07-14 02:55:26.514322] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.229 [2024-07-14 02:55:26.514438] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92
00:07:31.229 [2024-07-14 02:55:26.514570] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:31.229 [2024-07-14 02:55:26.514779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.514810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.229 [2024-07-14 02:55:26.514868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.514883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.229 [2024-07-14 02:55:26.514939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928303 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:31.229 [2024-07-14 02:55:26.514954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:31.229 #26 NEW cov: 11757 ft: 13670 corp: 8/132b lim: 30 exec/s: 0 rss: 66Mb L: 19/29 MS: 1 ChangeByte-
00:07:31.558 [2024-07-14 02:55:26.554406] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:31.558 [2024-07-14 02:55:26.554529] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.554733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.554760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.558 [2024-07-14 02:55:26.554815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.554831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.558 #32 NEW cov: 11780 ft: 13759 corp: 9/147b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:31.558 [2024-07-14 02:55:26.594548] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.594663] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:31.558 [2024-07-14 02:55:26.594770] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.594968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.594996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.558 [2024-07-14 02:55:26.595049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.595064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.558 [2024-07-14 02:55:26.595118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.595132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.558 #33 NEW cov: 11780 ft: 13807 corp: 10/166b lim: 30 exec/s: 0 rss: 67Mb L: 19/29 MS: 1 ShuffleBytes- 00:07:31.558 [2024-07-14 02:55:26.634599] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.634714] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.634931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.634957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.558 [2024-07-14 02:55:26.635011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.635029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.558 #34 NEW cov: 11780 ft: 13901 corp: 11/181b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 ShuffleBytes- 00:07:31.558 [2024-07-14 02:55:26.674732] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.674850] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.675055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.675084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.558 [2024-07-14 02:55:26.675141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7fff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.558 [2024-07-14 02:55:26.675156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.558 #35 NEW cov: 11780 ft: 13991 corp: 12/196b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 ChangeBit- 00:07:31.558 [2024-07-14 02:55:26.714939] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.715056] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.715164] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.715269] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.558 [2024-07-14 02:55:26.715472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.559 [2024-07-14 02:55:26.715500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.559 [2024-07-14 02:55:26.715554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.559 [2024-07-14 02:55:26.715570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.559 [2024-07-14 02:55:26.715625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.559 [2024-07-14 02:55:26.715641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.559 [2024-07-14 02:55:26.715696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.559 [2024-07-14 02:55:26.715712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.559 #36 NEW cov: 11780 ft: 14019 corp: 13/225b lim: 30 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CopyPart- 00:07:31.854 [2024-07-14 02:55:26.754981] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:31.854 [2024-07-14 02:55:26.755100] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.854 [2024-07-14 02:55:26.755309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.755338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.755395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.755413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #37 NEW cov: 11780 ft: 14033 corp: 14/240b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 ShuffleBytes- 00:07:31.855 [2024-07-14 02:55:26.795106] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.795222] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4ff 00:07:31.855 [2024-07-14 02:55:26.795437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.795470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.795525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7fff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.795540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #38 NEW cov: 11780 ft: 14054 corp: 15/255b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 CMP- DE: "\000\004"- 00:07:31.855 [2024-07-14 02:55:26.835184] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.835301] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.835513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.835541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.835595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7fff837a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.835609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #39 NEW cov: 11780 ft: 14070 corp: 16/271b lim: 30 exec/s: 0 rss: 67Mb L: 16/29 MS: 1 InsertByte- 00:07:31.855 [2024-07-14 02:55:26.875305] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:31.855 [2024-07-14 02:55:26.875625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 
cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.875653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.875709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.875723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:31.855 #40 NEW cov: 11820 ft: 14143 corp: 17/286b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:31.855 [2024-07-14 02:55:26.915398] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.915521] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.915724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.915752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.915808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7ff783ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.915826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #41 NEW cov: 11820 ft: 14152 corp: 18/301b lim: 30 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 ChangeBit- 00:07:31.855 [2024-07-14 02:55:26.955696] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.955816] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:31.855 [2024-07-14 02:55:26.955922] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:26.956126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.956154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.956211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.956227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.956283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928303 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.956297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.855 #42 NEW cov: 11820 ft: 14208 corp: 19/320b lim: 30 exec/s: 42 rss: 
67Mb L: 19/29 MS: 1 ShuffleBytes- 00:07:31.855 [2024-07-14 02:55:26.995778] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:31.855 [2024-07-14 02:55:26.995894] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:31.855 [2024-07-14 02:55:26.996097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.996124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:26.996182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:26.996198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #43 NEW cov: 11820 ft: 14222 corp: 20/335b lim: 30 exec/s: 43 rss: 67Mb L: 15/29 MS: 1 ChangeBinInt- 00:07:31.855 [2024-07-14 02:55:27.035924] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:31.855 [2024-07-14 02:55:27.036038] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:27.036241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:27.036269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:27.036325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:92928303 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:27.036340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #44 NEW cov: 11820 ft: 14265 corp: 21/348b lim: 30 exec/s: 44 rss: 67Mb L: 13/29 MS: 1 EraseBytes- 00:07:31.855 [2024-07-14 02:55:27.076065] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:31.855 [2024-07-14 02:55:27.076182] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:31.855 [2024-07-14 02:55:27.076389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:27.076420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.855 [2024-07-14 02:55:27.076486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.855 [2024-07-14 02:55:27.076502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.855 #45 NEW cov: 11820 ft: 14295 corp: 22/365b lim: 30 exec/s: 45 rss: 67Mb L: 17/29 MS: 1 PersAutoDict- DE: "\000\004"- 00:07:32.115 [2024-07-14 02:55:27.116200] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.116314] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.116422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009292 00:07:32.115 [2024-07-14 02:55:27.116635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.116662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.116718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.116734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.116790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.116804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.115 #46 NEW cov: 11820 ft: 14296 corp: 23/384b lim: 30 exec/s: 46 rss: 68Mb L: 19/29 MS: 1 CopyPart- 00:07:32.115 [2024-07-14 02:55:27.156261] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.156377] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x33ff 00:07:32.115 [2024-07-14 02:55:27.156602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.156629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.156685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7fff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.156700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.115 #47 NEW cov: 11820 ft: 14316 corp: 24/399b lim: 30 exec/s: 47 rss: 68Mb L: 15/29 MS: 1 ChangeByte- 00:07:32.115 [2024-07-14 02:55:27.196529] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.196644] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.196752] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:32.115 [2024-07-14 02:55:27.196855] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.197075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.197102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.197160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0aff83ff 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.197178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.197235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.197250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.197306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:92928303 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.197321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.115 #48 NEW cov: 11820 ft: 14336 corp: 25/424b lim: 30 exec/s: 48 rss: 68Mb L: 25/29 MS: 1 InsertRepeatedBytes- 00:07:32.115 [2024-07-14 02:55:27.236524] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.236639] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.236860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.236888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.236945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.236960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.115 #49 NEW cov: 11820 ft: 14370 corp: 26/439b lim: 30 exec/s: 49 rss: 68Mb L: 15/29 MS: 1 ShuffleBytes- 00:07:32.115 [2024-07-14 02:55:27.276718] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.276834] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:32.115 [2024-07-14 02:55:27.276944] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.277165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.277192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.277249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.277264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.115 [2024-07-14 02:55:27.277318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.277333] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.115 #50 NEW cov: 11820 ft: 14412 corp: 27/458b lim: 30 exec/s: 50 rss: 68Mb L: 19/29 MS: 1 ShuffleBytes- 00:07:32.115 [2024-07-14 02:55:27.316885] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000006ff 00:07:32.115 [2024-07-14 02:55:27.317002] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.317110] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.317217] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.317335] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.115 [2024-07-14 02:55:27.317574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.115 [2024-07-14 02:55:27.317603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.317659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.317674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.317728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.317742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.317796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.317810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.317866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.317880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.116 #51 NEW cov: 11820 ft: 14496 corp: 28/488b lim: 30 exec/s: 51 rss: 68Mb L: 30/30 MS: 1 InsertByte- 00:07:32.116 [2024-07-14 02:55:27.356983] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a1a1 00:07:32.116 [2024-07-14 02:55:27.357098] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a1a1 00:07:32.116 [2024-07-14 02:55:27.357205] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.116 [2024-07-14 02:55:27.357313] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.116 [2024-07-14 02:55:27.357545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 
[2024-07-14 02:55:27.357572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.357628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a1a181a1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.357643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.357701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a1ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.357715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.116 [2024-07-14 02:55:27.357747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:000483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.116 [2024-07-14 02:55:27.357761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.375 #52 NEW cov: 11820 ft: 14509 corp: 29/512b lim: 30 exec/s: 52 rss: 68Mb L: 24/30 MS: 1 InsertRepeatedBytes- 00:07:32.375 [2024-07-14 02:55:27.397034] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.375 [2024-07-14 02:55:27.397153] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524288) > buf size (4096) 00:07:32.375 [2024-07-14 02:55:27.397352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.397384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.397440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.397460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.375 #53 NEW cov: 11820 ft: 14530 corp: 30/527b lim: 30 exec/s: 53 rss: 68Mb L: 15/30 MS: 1 ChangeBinInt- 00:07:32.375 [2024-07-14 02:55:27.437167] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.375 [2024-07-14 02:55:27.437285] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7ff 00:07:32.375 [2024-07-14 02:55:27.437505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.437533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.437590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7fff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.437615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.375 #54 NEW cov: 11820 ft: 14532 corp: 31/542b lim: 30 
exec/s: 54 rss: 68Mb L: 15/30 MS: 1 ShuffleBytes- 00:07:32.375 [2024-07-14 02:55:27.477304] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:32.375 [2024-07-14 02:55:27.477421] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.375 [2024-07-14 02:55:27.477533] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786432) > buf size (4096) 00:07:32.375 [2024-07-14 02:55:27.477749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.477776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.477834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.477848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.477903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.477917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.375 #55 NEW cov: 11820 ft: 14566 corp: 32/561b lim: 30 exec/s: 55 rss: 68Mb L: 19/30 MS: 1 CrossOver- 00:07:32.375 [2024-07-14 02:55:27.517311] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:32.375 [2024-07-14 02:55:27.517524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.517550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.375 #56 NEW cov: 11820 ft: 14989 corp: 33/572b lim: 30 exec/s: 56 rss: 68Mb L: 11/30 MS: 1 EraseBytes- 00:07:32.375 [2024-07-14 02:55:27.557527] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000006ff 00:07:32.375 [2024-07-14 02:55:27.557642] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.375 [2024-07-14 02:55:27.557744] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.375 [2024-07-14 02:55:27.557953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.557980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.558037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.558053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.558107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.558121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.375 #57 NEW cov: 11820 ft: 14993 corp: 34/595b lim: 30 exec/s: 57 rss: 68Mb L: 23/30 MS: 1 EraseBytes- 00:07:32.375 [2024-07-14 02:55:27.597659] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300002323 00:07:32.375 [2024-07-14 02:55:27.597776] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000023f7 00:07:32.375 [2024-07-14 02:55:27.597886] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.375 [2024-07-14 02:55:27.597992] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000003ff 00:07:32.375 [2024-07-14 02:55:27.598220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.598248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.598305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:23238323 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.598320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.598375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.598395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.375 [2024-07-14 02:55:27.598449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ff920292 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.375 [2024-07-14 02:55:27.598464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.375 #58 NEW cov: 11820 ft: 15021 corp: 35/622b lim: 30 exec/s: 58 rss: 68Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:32.635 [2024-07-14 02:55:27.637754] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.637869] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:32.635 [2024-07-14 02:55:27.637980] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7a 00:07:32.635 [2024-07-14 02:55:27.638185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.638211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.638268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.638283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:32.635 [2024-07-14 02:55:27.638344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.638357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.635 #59 NEW cov: 11820 ft: 15024 corp: 36/642b lim: 30 exec/s: 59 rss: 68Mb L: 20/30 MS: 1 InsertByte- 00:07:32.635 [2024-07-14 02:55:27.677904] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.678019] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.678129] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009292 00:07:32.635 [2024-07-14 02:55:27.678354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.678382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.678438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.678459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.678516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.678531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.635 #60 NEW cov: 11820 ft: 15031 corp: 37/662b lim: 30 exec/s: 60 rss: 68Mb L: 20/30 MS: 1 InsertByte- 00:07:32.635 [2024-07-14 02:55:27.717980] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.718097] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4ff 00:07:32.635 [2024-07-14 02:55:27.718306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.718333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.718389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7fff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.718404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.635 #61 NEW cov: 11820 ft: 15036 corp: 38/677b lim: 30 exec/s: 61 rss: 69Mb L: 15/30 MS: 1 ChangeByte- 00:07:32.635 [2024-07-14 02:55:27.758095] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:32.635 [2024-07-14 02:55:27.758213] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:07:32.635 [2024-07-14 02:55:27.758535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET 
LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.758563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.758617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.758633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.758687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.758705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.635 #62 NEW cov: 11820 ft: 15068 corp: 39/700b lim: 30 exec/s: 62 rss: 69Mb L: 23/30 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:32.635 [2024-07-14 02:55:27.798218] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.798335] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.798450] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.798655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.798682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.798737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.798753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.798807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.798821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.635 #63 NEW cov: 11820 ft: 15075 corp: 40/722b lim: 30 exec/s: 63 rss: 69Mb L: 22/30 MS: 1 CopyPart- 00:07:32.635 [2024-07-14 02:55:27.838330] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.838454] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:32.635 [2024-07-14 02:55:27.838566] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.838771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0afe83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.838798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.838852] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.838867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.838919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.838933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.878473] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.878592] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff92 00:07:32.635 [2024-07-14 02:55:27.878702] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.635 [2024-07-14 02:55:27.878926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.878953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.879008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.879023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.635 [2024-07-14 02:55:27.879079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:92928392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.635 [2024-07-14 02:55:27.879093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.896 #65 NEW cov: 11820 ft: 15079 corp: 41/741b lim: 30 exec/s: 65 rss: 69Mb L: 19/30 MS: 2 ChangeBit-CopyPart- 00:07:32.896 [2024-07-14 02:55:27.918597] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:07:32.896 [2024-07-14 02:55:27.918725] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:32.896 [2024-07-14 02:55:27.918836] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:32.896 [2024-07-14 02:55:27.919045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.919073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.896 [2024-07-14 02:55:27.919126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.919141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.896 [2024-07-14 02:55:27.919195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.919209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.896 #66 NEW cov: 11820 ft: 15085 corp: 42/764b lim: 30 exec/s: 66 rss: 69Mb L: 23/30 MS: 1 CopyPart- 00:07:32.896 [2024-07-14 02:55:27.958730] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a1a1 00:07:32.896 [2024-07-14 02:55:27.958847] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a1a1 00:07:32.896 [2024-07-14 02:55:27.958961] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000107 00:07:32.896 [2024-07-14 02:55:27.959068] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.896 [2024-07-14 02:55:27.959277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.959304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.896 [2024-07-14 02:55:27.959358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a1a181a1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.959373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.896 [2024-07-14 02:55:27.959429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.959446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.896 [2024-07-14 02:55:27.959501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8f4e83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.896 [2024-07-14 02:55:27.959515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.896 #67 NEW cov: 11820 ft: 15110 corp: 43/788b lim: 30 exec/s: 33 rss: 69Mb L: 24/30 MS: 1 CMP- DE: "\377\377\377\377\001\007\217N"- 00:07:32.896 #67 DONE cov: 11820 ft: 15110 corp: 43/788b lim: 30 exec/s: 33 rss: 69Mb 00:07:32.896 ###### Recommended dictionary. ###### 00:07:32.896 "\001\000\000\000\000\000\000\000" # Uses: 2 00:07:32.896 "\000\004" # Uses: 1 00:07:32.896 "\377\377\377\377\001\007\217N" # Uses: 0 00:07:32.896 ###### End of recommended dictionary. 
###### 00:07:32.896 Done 67 runs in 2 second(s) 00:07:32.896 02:55:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:32.896 02:55:28 -- ../common.sh@72 -- # (( i++ )) 00:07:32.896 02:55:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.896 02:55:28 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:32.896 02:55:28 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:32.896 02:55:28 -- nvmf/run.sh@24 -- # local timen=1 00:07:32.896 02:55:28 -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.896 02:55:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:32.896 02:55:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:32.896 02:55:28 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:32.896 02:55:28 -- nvmf/run.sh@29 -- # port=4402 00:07:32.896 02:55:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:32.896 02:55:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:32.896 02:55:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.896 02:55:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:32.896 [2024-07-14 02:55:28.141373] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:32.896 [2024-07-14 02:55:28.141453] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid667276 ] 00:07:33.156 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.156 [2024-07-14 02:55:28.406304] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.413 [2024-07-14 02:55:28.433625] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.413 [2024-07-14 02:55:28.433754] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.413 [2024-07-14 02:55:28.485492] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:33.413 [2024-07-14 02:55:28.501777] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:33.413 INFO: Running with entropic power schedule (0xFF, 100). 00:07:33.413 INFO: Seed: 2879075714 00:07:33.413 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:33.413 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:33.413 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:33.413 INFO: A corpus is not provided, starting from an empty corpus 00:07:33.413 #2 INITED exec/s: 0 rss: 59Mb 00:07:33.413 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
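For readers tracing the nvmf/run.sh lines above: the launch sequence for each fuzzer instance reduces to the sketch below. It is reconstructed only from the commands visible in the trace (the printf-derived port, the mkdir of the corpus directory, the trsvcid sed substitution, and the llvm_nvme_fuzz flags); the SPDK_DIR variable, the redirection of sed's output into the per-fuzzer conf file, and the standalone-script framing are assumptions, not the literal run.sh source.

    #!/bin/bash
    # Minimal sketch of the per-fuzzer launch traced above; names are illustrative.
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # path as seen in the trace
    fuzzer_type=2    # which admin-command fuzzer to run (-Z)
    timen=1          # run time in seconds (-t)
    core=0x1         # core mask (-m)

    # Port 44NN, e.g. 4402 for fuzzer 2 (trace: printf %02d 2; port=4402).
    port=44$(printf %02d "${fuzzer_type}")
    corpus_dir=${SPDK_DIR}/../corpus/llvm_nvmf_${fuzzer_type}
    mkdir -p "${corpus_dir}"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

    # Point the JSON config at this fuzzer's TCP port (output redirection assumed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
        "${SPDK_DIR}/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_${fuzzer_type}.conf

    "${SPDK_DIR}/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "${core}" -s 512 \
        -P "${SPDK_DIR}/../output/llvm/" -F "${trid}" -c /tmp/fuzz_json_${fuzzer_type}.conf \
        -t "${timen}" -D "${corpus_dir}" -Z "${fuzzer_type}" -r /var/tmp/spdk${fuzzer_type}.sock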
00:07:33.413 This may also happen if the target rejected all inputs we tried so far 00:07:33.413 [2024-07-14 02:55:28.573397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.413 [2024-07-14 02:55:28.573435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.413 [2024-07-14 02:55:28.573511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.413 [2024-07-14 02:55:28.573528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.413 [2024-07-14 02:55:28.573606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.413 [2024-07-14 02:55:28.573627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.413 [2024-07-14 02:55:28.573699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.413 [2024-07-14 02:55:28.573714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.670 NEW_FUNC[1/670]: 0x494b60 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:33.670 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.670 #7 NEW cov: 11504 ft: 11505 corp: 2/33b lim: 35 exec/s: 0 rss: 66Mb L: 32/32 MS: 5 CrossOver-ShuffleBytes-CrossOver-InsertByte-InsertRepeatedBytes- 00:07:33.670 [2024-07-14 02:55:28.913208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.670 [2024-07-14 02:55:28.913257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.670 [2024-07-14 02:55:28.913384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.670 [2024-07-14 02:55:28.913406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.670 [2024-07-14 02:55:28.913539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.670 [2024-07-14 02:55:28.913562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.927 #8 NEW cov: 11624 ft: 12565 corp: 3/55b lim: 35 exec/s: 0 rss: 66Mb L: 22/32 MS: 1 InsertRepeatedBytes- 00:07:33.927 [2024-07-14 02:55:28.953340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.953368] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:28.953499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:ff00ebff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.953518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:28.953628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.953647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:28.953759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.953775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.927 #9 NEW cov: 11630 ft: 12972 corp: 4/87b lim: 35 exec/s: 0 rss: 66Mb L: 32/32 MS: 1 CrossOver- 00:07:33.927 [2024-07-14 02:55:28.993213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.993240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:28.993374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.993390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:28.993506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:28.993523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.927 #14 NEW cov: 11715 ft: 13215 corp: 5/111b lim: 35 exec/s: 0 rss: 66Mb L: 24/32 MS: 5 ShuffleBytes-CopyPart-CopyPart-ChangeByte-CrossOver- 00:07:33.927 [2024-07-14 02:55:29.033313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:878700b4 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.033340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:29.033477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:87870087 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.033494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:29.033616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:87870087 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.033632] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.927 #18 NEW cov: 11715 ft: 13255 corp: 6/137b lim: 35 exec/s: 0 rss: 66Mb L: 26/32 MS: 4 ChangeByte-InsertRepeatedBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:33.927 [2024-07-14 02:55:29.073450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.073477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:29.073605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.073622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:29.073737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.073753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.927 #19 NEW cov: 11715 ft: 13286 corp: 7/164b lim: 35 exec/s: 0 rss: 66Mb L: 27/32 MS: 1 InsertRepeatedBytes- 00:07:33.927 [2024-07-14 02:55:29.113171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1414000a cdw11:14001414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.113198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.927 #21 NEW cov: 11715 ft: 13759 corp: 8/172b lim: 35 exec/s: 0 rss: 66Mb L: 8/32 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:33.927 [2024-07-14 02:55:29.153503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.153531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.927 [2024-07-14 02:55:29.153646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.927 [2024-07-14 02:55:29.153662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 #22 NEW cov: 11715 ft: 14048 corp: 9/187b lim: 35 exec/s: 0 rss: 66Mb L: 15/32 MS: 1 EraseBytes- 00:07:34.185 [2024-07-14 02:55:29.193840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:878700b4 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.193867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.193982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:87870087 cdw11:a2008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.194015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.194129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:87870087 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.194147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.185 #23 NEW cov: 11715 ft: 14153 corp: 10/213b lim: 35 exec/s: 0 rss: 66Mb L: 26/32 MS: 1 ChangeByte- 00:07:34.185 [2024-07-14 02:55:29.233739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.233765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.233885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.233902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 #24 NEW cov: 11715 ft: 14245 corp: 11/233b lim: 35 exec/s: 0 rss: 66Mb L: 20/32 MS: 1 EraseBytes- 00:07:34.185 [2024-07-14 02:55:29.274040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.274067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.274177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.274193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.274305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff0024ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.274321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.185 #25 NEW cov: 11715 ft: 14272 corp: 12/260b lim: 35 exec/s: 0 rss: 67Mb L: 27/32 MS: 1 ChangeByte- 00:07:34.185 [2024-07-14 02:55:29.314435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.314466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.314574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.314592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.314706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.314721] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.314836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.314855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.185 #26 NEW cov: 11715 ft: 14304 corp: 13/293b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertByte- 00:07:34.185 [2024-07-14 02:55:29.354731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.354757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.354872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.354888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.355012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.355030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.355148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.355165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.355281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.355298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.185 #27 NEW cov: 11715 ft: 14363 corp: 14/328b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CrossOver- 00:07:34.185 [2024-07-14 02:55:29.394661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.394690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.394808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.394826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.394950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.394966] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.395089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.395107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.185 #28 NEW cov: 11715 ft: 14372 corp: 15/356b lim: 35 exec/s: 0 rss: 67Mb L: 28/35 MS: 1 EraseBytes- 00:07:34.185 [2024-07-14 02:55:29.434869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0500000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.434895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.435019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.435040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.435154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.435174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.185 [2024-07-14 02:55:29.435289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.185 [2024-07-14 02:55:29.435306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.443 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:34.443 #29 NEW cov: 11738 ft: 14412 corp: 16/388b lim: 35 exec/s: 0 rss: 67Mb L: 32/35 MS: 1 ChangeBinInt- 00:07:34.443 [2024-07-14 02:55:29.474677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.474704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.474841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ff2d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.474857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.474973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.475005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.443 #30 NEW cov: 11738 ft: 14488 corp: 17/415b lim: 35 exec/s: 0 rss: 67Mb L: 27/35 MS: 1 ChangeByte- 00:07:34.443 [2024-07-14 02:55:29.515249] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.515277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.515394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.515410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.515525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.515542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.515657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.515673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.515788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.515804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.443 #31 NEW cov: 11738 ft: 14514 corp: 18/450b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CopyPart- 00:07:34.443 [2024-07-14 02:55:29.554885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.554912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.555042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.555060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.555180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.555197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.443 #32 NEW cov: 11738 ft: 14529 corp: 19/474b lim: 35 exec/s: 32 rss: 67Mb L: 24/35 MS: 1 ShuffleBytes- 00:07:34.443 [2024-07-14 02:55:29.595285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.595312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.595452] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.443 [2024-07-14 02:55:29.595469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.443 [2024-07-14 02:55:29.595597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.595616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.444 [2024-07-14 02:55:29.595737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.595754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.444 #33 NEW cov: 11738 ft: 14558 corp: 20/507b lim: 35 exec/s: 33 rss: 67Mb L: 33/35 MS: 1 ChangeByte- 00:07:34.444 [2024-07-14 02:55:29.634734] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.444 [2024-07-14 02:55:29.635589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.635634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.444 [2024-07-14 02:55:29.635754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.635769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.444 [2024-07-14 02:55:29.635889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.635905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.444 [2024-07-14 02:55:29.636017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.636035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.444 [2024-07-14 02:55:29.636158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.636174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.444 #34 NEW cov: 11747 ft: 14606 corp: 21/542b lim: 35 exec/s: 34 rss: 67Mb L: 35/35 MS: 1 ChangeByte- 00:07:34.444 [2024-07-14 02:55:29.675119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.675147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.444 [2024-07-14 02:55:29.675277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.444 [2024-07-14 02:55:29.675296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.701 #35 NEW cov: 11747 ft: 14632 corp: 22/557b lim: 35 exec/s: 35 rss: 67Mb L: 15/35 MS: 1 ChangeBit- 00:07:34.701 [2024-07-14 02:55:29.715366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:878700b4 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.701 [2024-07-14 02:55:29.715394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.701 [2024-07-14 02:55:29.715517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a2870087 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.701 [2024-07-14 02:55:29.715533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.701 [2024-07-14 02:55:29.715650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:87870087 cdw11:87008787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.715667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.702 #36 NEW cov: 11747 ft: 14665 corp: 23/583b lim: 35 exec/s: 36 rss: 67Mb L: 26/35 MS: 1 ShuffleBytes- 00:07:34.702 [2024-07-14 02:55:29.755397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.755426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.755539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.755572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.702 #37 NEW cov: 11747 ft: 14675 corp: 24/597b lim: 35 exec/s: 37 rss: 68Mb L: 14/35 MS: 1 EraseBytes- 00:07:34.702 [2024-07-14 02:55:29.795658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.795686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.795803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.795819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.795933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:eb00ffeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 
[2024-07-14 02:55:29.795949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.702 #38 NEW cov: 11747 ft: 14690 corp: 25/620b lim: 35 exec/s: 38 rss: 68Mb L: 23/35 MS: 1 CrossOver- 00:07:34.702 [2024-07-14 02:55:29.835717] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.702 [2024-07-14 02:55:29.836077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.836109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.836229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.836244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.836362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ebeb00eb cdw11:0000eb00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.836382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.836507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:eb000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.836526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.702 #39 NEW cov: 11747 ft: 14695 corp: 26/650b lim: 35 exec/s: 39 rss: 68Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:07:34.702 [2024-07-14 02:55:29.876389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.876416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.876533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007f cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.876549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.876657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.876672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.876793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.876809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.876885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.876902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.702 #40 NEW cov: 11747 ft: 14698 corp: 27/685b lim: 35 exec/s: 40 rss: 68Mb L: 35/35 MS: 1 ChangeBit- 00:07:34.702 [2024-07-14 02:55:29.915904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.915933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.702 [2024-07-14 02:55:29.916052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.702 [2024-07-14 02:55:29.916072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.702 #41 NEW cov: 11747 ft: 14759 corp: 28/704b lim: 35 exec/s: 41 rss: 68Mb L: 19/35 MS: 1 CrossOver- 00:07:34.959 [2024-07-14 02:55:29.966652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.959 [2024-07-14 02:55:29.966680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.959 [2024-07-14 02:55:29.966790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.959 [2024-07-14 02:55:29.966807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.959 [2024-07-14 02:55:29.966928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:17ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.959 [2024-07-14 02:55:29.966946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.959 [2024-07-14 02:55:29.967071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.959 [2024-07-14 02:55:29.967090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.959 [2024-07-14 02:55:29.967214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.959 [2024-07-14 02:55:29.967231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.959 #42 NEW cov: 11747 ft: 14768 corp: 29/739b lim: 35 exec/s: 42 rss: 68Mb L: 35/35 MS: 1 ChangeByte- 00:07:34.959 [2024-07-14 02:55:30.008448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.959 [2024-07-14 02:55:30.008476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:34.960 [2024-07-14 02:55:30.008616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:eba300eb cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.008636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 #43 NEW cov: 11747 ft: 14860 corp: 30/754b lim: 35 exec/s: 43 rss: 68Mb L: 15/35 MS: 1 ChangeByte- 00:07:34.960 [2024-07-14 02:55:30.046316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.046344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.046460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.046478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 #44 NEW cov: 11747 ft: 14880 corp: 31/774b lim: 35 exec/s: 44 rss: 68Mb L: 20/35 MS: 1 CrossOver- 00:07:34.960 [2024-07-14 02:55:30.097112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.097142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.097263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.097282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.097399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.097416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.097534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.097551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.097669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.097687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.960 #45 NEW cov: 11747 ft: 14894 corp: 32/809b lim: 35 exec/s: 45 rss: 68Mb L: 35/35 MS: 1 CrossOver- 00:07:34.960 [2024-07-14 02:55:30.136357] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.960 [2024-07-14 02:55:30.137257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0000 
cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.137294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.137421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.137439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.137560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:fffd00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.137581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.137708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.137726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.137838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ebeb00eb cdw11:b600ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.137854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.960 #46 NEW cov: 11747 ft: 14918 corp: 33/844b lim: 35 exec/s: 46 rss: 68Mb L: 35/35 MS: 1 ChangeBit- 00:07:34.960 [2024-07-14 02:55:30.186931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.186960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.187078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.187094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 [2024-07-14 02:55:30.187219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.960 [2024-07-14 02:55:30.187239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.960 #47 NEW cov: 11747 ft: 14923 corp: 34/869b lim: 35 exec/s: 47 rss: 68Mb L: 25/35 MS: 1 EraseBytes- 00:07:35.218 [2024-07-14 02:55:30.226790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.226820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.226943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.226960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.218 #48 NEW cov: 11747 ft: 14924 corp: 35/889b lim: 35 exec/s: 48 rss: 68Mb L: 20/35 MS: 1 ShuffleBytes- 00:07:35.218 [2024-07-14 02:55:30.267386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.267413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.267543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:ff00ebff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.267562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.267682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.267699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.267840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.267858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.218 #49 NEW cov: 11747 ft: 14934 corp: 36/921b lim: 35 exec/s: 49 rss: 68Mb L: 32/35 MS: 1 CrossOver- 00:07:35.218 [2024-07-14 02:55:30.307371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00fff7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.307397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.307530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.307547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.307664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.307681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.218 #50 NEW cov: 11747 ft: 14947 corp: 37/946b lim: 35 exec/s: 50 rss: 69Mb L: 25/35 MS: 1 ChangeBinInt- 00:07:35.218 [2024-07-14 02:55:30.347263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.347291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.347407] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:eb00ffeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.347422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.218 #51 NEW cov: 11747 ft: 14953 corp: 38/964b lim: 35 exec/s: 51 rss: 69Mb L: 18/35 MS: 1 EraseBytes- 00:07:35.218 [2024-07-14 02:55:30.387308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.387338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.387457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:eba300eb cdw11:2500ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.387476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.218 #52 NEW cov: 11747 ft: 14957 corp: 39/980b lim: 35 exec/s: 52 rss: 69Mb L: 16/35 MS: 1 InsertByte- 00:07:35.218 [2024-07-14 02:55:30.437881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.437907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.438038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:0a00ebff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.438059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.438176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.438193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.218 [2024-07-14 02:55:30.438311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.218 [2024-07-14 02:55:30.438328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.218 #53 NEW cov: 11747 ft: 14965 corp: 40/1012b lim: 35 exec/s: 53 rss: 70Mb L: 32/35 MS: 1 CrossOver- 00:07:35.476 [2024-07-14 02:55:30.477560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ebeb000a cdw11:eb00ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.476 [2024-07-14 02:55:30.477599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.476 [2024-07-14 02:55:30.477708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:2100ebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.476 [2024-07-14 02:55:30.477726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.476 #54 NEW cov: 11747 ft: 14978 corp: 41/1027b lim: 35 exec/s: 54 rss: 70Mb L: 15/35 MS: 1 ChangeByte- 00:07:35.476 [2024-07-14 02:55:30.518113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.476 [2024-07-14 02:55:30.518140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.476 [2024-07-14 02:55:30.518253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ebeb00eb cdw11:ff00ebff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.476 [2024-07-14 02:55:30.518286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.476 [2024-07-14 02:55:30.518404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.476 [2024-07-14 02:55:30.518421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.476 [2024-07-14 02:55:30.518544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffbf00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.477 [2024-07-14 02:55:30.518560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.477 #55 NEW cov: 11747 ft: 14979 corp: 42/1059b lim: 35 exec/s: 27 rss: 70Mb L: 32/35 MS: 1 ChangeBit- 00:07:35.477 #55 DONE cov: 11747 ft: 14979 corp: 42/1059b lim: 35 exec/s: 27 rss: 70Mb 00:07:35.477 Done 55 runs in 2 second(s) 00:07:35.477 02:55:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:35.477 02:55:30 -- ../common.sh@72 -- # (( i++ )) 00:07:35.477 02:55:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.477 02:55:30 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:35.477 02:55:30 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:35.477 02:55:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.477 02:55:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.477 02:55:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:35.477 02:55:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:35.477 02:55:30 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:35.477 02:55:30 -- nvmf/run.sh@29 -- # port=4403 00:07:35.477 02:55:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:35.477 02:55:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:35.477 02:55:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.477 02:55:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:35.477 
[2024-07-14 02:55:30.700347] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:35.477 [2024-07-14 02:55:30.700412] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid667819 ] 00:07:35.735 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.735 [2024-07-14 02:55:30.950351] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.735 [2024-07-14 02:55:30.978946] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.735 [2024-07-14 02:55:30.979062] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.993 [2024-07-14 02:55:31.030413] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.993 [2024-07-14 02:55:31.046706] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:35.993 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.993 INFO: Seed: 1131076006 00:07:35.993 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:35.993 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:35.993 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:35.993 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.993 #2 INITED exec/s: 0 rss: 59Mb 00:07:35.993 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:35.993 This may also happen if the target rejected all inputs we tried so far 00:07:36.251 NEW_FUNC[1/659]: 0x496830 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:36.251 NEW_FUNC[2/659]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.251 #7 NEW cov: 11419 ft: 11412 corp: 2/9b lim: 20 exec/s: 0 rss: 66Mb L: 8/8 MS: 5 CopyPart-CrossOver-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:36.251 #8 NEW cov: 11536 ft: 12126 corp: 3/17b lim: 20 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ChangeByte- 00:07:36.509 #11 NEW cov: 11559 ft: 12737 corp: 4/35b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 3 ChangeBinInt-ChangeBinInt-InsertRepeatedBytes- 00:07:36.509 #12 NEW cov: 11644 ft: 13046 corp: 5/53b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeBit- 00:07:36.509 [2024-07-14 02:55:31.563510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.509 [2024-07-14 02:55:31.563549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.509 NEW_FUNC[1/19]: 0x1151830 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:36.509 NEW_FUNC[2/19]: 0x11523b0 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:36.509 #13 NEW cov: 11942 ft: 13522 corp: 6/71b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeBinInt- 00:07:36.509 #16 NEW cov: 11942 ft: 13622 corp: 7/88b lim: 20 exec/s: 0 rss: 67Mb L: 17/18 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:36.509 #17 NEW cov: 11942 ft: 13781 corp: 8/105b lim: 20 exec/s: 0 rss: 67Mb L: 17/18 
MS: 1 ChangeByte- 00:07:36.509 #18 NEW cov: 11942 ft: 13864 corp: 9/123b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 CrossOver- 00:07:36.509 #19 NEW cov: 11942 ft: 13913 corp: 10/141b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ShuffleBytes- 00:07:36.767 #20 NEW cov: 11942 ft: 13995 corp: 11/158b lim: 20 exec/s: 0 rss: 67Mb L: 17/18 MS: 1 ChangeBinInt- 00:07:36.767 #25 NEW cov: 11942 ft: 14037 corp: 12/169b lim: 20 exec/s: 0 rss: 67Mb L: 11/18 MS: 5 InsertByte-ChangeBinInt-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:36.767 #26 NEW cov: 11942 ft: 14140 corp: 13/177b lim: 20 exec/s: 0 rss: 67Mb L: 8/18 MS: 1 CMP- DE: "\000\000\000\002"- 00:07:36.767 #27 NEW cov: 11946 ft: 14239 corp: 14/189b lim: 20 exec/s: 0 rss: 68Mb L: 12/18 MS: 1 InsertRepeatedBytes- 00:07:36.767 #28 NEW cov: 11946 ft: 14268 corp: 15/200b lim: 20 exec/s: 0 rss: 68Mb L: 11/18 MS: 1 CMP- DE: "\001\016"- 00:07:36.767 #29 NEW cov: 11946 ft: 14287 corp: 16/219b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:07:37.026 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:37.026 #30 NEW cov: 11969 ft: 14343 corp: 17/236b lim: 20 exec/s: 0 rss: 68Mb L: 17/19 MS: 1 CrossOver- 00:07:37.026 #31 NEW cov: 11969 ft: 14369 corp: 18/253b lim: 20 exec/s: 0 rss: 68Mb L: 17/19 MS: 1 EraseBytes- 00:07:37.026 #32 NEW cov: 11969 ft: 14391 corp: 19/269b lim: 20 exec/s: 32 rss: 68Mb L: 16/19 MS: 1 EraseBytes- 00:07:37.026 #33 NEW cov: 11969 ft: 14415 corp: 20/277b lim: 20 exec/s: 33 rss: 68Mb L: 8/19 MS: 1 ChangeBit- 00:07:37.026 #34 NEW cov: 11969 ft: 14436 corp: 21/293b lim: 20 exec/s: 34 rss: 68Mb L: 16/19 MS: 1 ShuffleBytes- 00:07:37.026 [2024-07-14 02:55:32.195014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.026 [2024-07-14 02:55:32.195044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.026 #35 NEW cov: 11969 ft: 14465 corp: 22/304b lim: 20 exec/s: 35 rss: 68Mb L: 11/19 MS: 1 CopyPart- 00:07:37.026 #38 NEW cov: 11969 ft: 14475 corp: 23/320b lim: 20 exec/s: 38 rss: 68Mb L: 16/19 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:37.285 #39 NEW cov: 11969 ft: 14484 corp: 24/338b lim: 20 exec/s: 39 rss: 68Mb L: 18/19 MS: 1 ChangeByte- 00:07:37.285 #40 NEW cov: 11969 ft: 14491 corp: 25/353b lim: 20 exec/s: 40 rss: 69Mb L: 15/19 MS: 1 InsertRepeatedBytes- 00:07:37.285 #46 NEW cov: 11969 ft: 14502 corp: 26/372b lim: 20 exec/s: 46 rss: 69Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:37.285 #47 NEW cov: 11969 ft: 14537 corp: 27/392b lim: 20 exec/s: 47 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:37.285 #48 NEW cov: 11969 ft: 14544 corp: 28/400b lim: 20 exec/s: 48 rss: 69Mb L: 8/20 MS: 1 ChangeByte- 00:07:37.285 #49 NEW cov: 11969 ft: 14627 corp: 29/417b lim: 20 exec/s: 49 rss: 69Mb L: 17/20 MS: 1 ChangeBit- 00:07:37.285 [2024-07-14 02:55:32.505804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.285 [2024-07-14 02:55:32.505833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:2 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.285 #50 NEW cov: 11969 ft: 14861 corp: 30/422b lim: 20 exec/s: 50 rss: 69Mb L: 5/20 MS: 1 PersAutoDict- DE: "\000\000\000\002"- 00:07:37.542 #51 NEW cov: 11969 ft: 14876 corp: 31/442b lim: 20 exec/s: 51 rss: 69Mb L: 20/20 MS: 1 
CopyPart- 00:07:37.543 #52 NEW cov: 11969 ft: 14900 corp: 32/459b lim: 20 exec/s: 52 rss: 69Mb L: 17/20 MS: 1 PersAutoDict- DE: "\001\016"- 00:07:37.543 #53 NEW cov: 11969 ft: 14932 corp: 33/477b lim: 20 exec/s: 53 rss: 69Mb L: 18/20 MS: 1 ChangeBinInt- 00:07:37.543 #54 NEW cov: 11969 ft: 14941 corp: 34/491b lim: 20 exec/s: 54 rss: 69Mb L: 14/20 MS: 1 EraseBytes- 00:07:37.543 #55 NEW cov: 11969 ft: 14977 corp: 35/498b lim: 20 exec/s: 55 rss: 69Mb L: 7/20 MS: 1 EraseBytes- 00:07:37.543 #56 NEW cov: 11969 ft: 14988 corp: 36/505b lim: 20 exec/s: 56 rss: 69Mb L: 7/20 MS: 1 EraseBytes- 00:07:37.801 #57 NEW cov: 11969 ft: 14991 corp: 37/522b lim: 20 exec/s: 57 rss: 69Mb L: 17/20 MS: 1 PersAutoDict- DE: "\000\000\000\002"- 00:07:37.801 [2024-07-14 02:55:32.827112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.801 [2024-07-14 02:55:32.827138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.801 #58 NEW cov: 11969 ft: 14995 corp: 38/541b lim: 20 exec/s: 58 rss: 69Mb L: 19/20 MS: 1 InsertRepeatedBytes- 00:07:37.801 #59 NEW cov: 11969 ft: 15071 corp: 39/546b lim: 20 exec/s: 59 rss: 69Mb L: 5/20 MS: 1 ChangeByte- 00:07:37.801 #61 NEW cov: 11969 ft: 15208 corp: 40/563b lim: 20 exec/s: 61 rss: 69Mb L: 17/20 MS: 2 InsertByte-ChangeBinInt- 00:07:37.801 #62 NEW cov: 11969 ft: 15215 corp: 41/577b lim: 20 exec/s: 62 rss: 70Mb L: 14/20 MS: 1 EraseBytes- 00:07:37.801 #63 NEW cov: 11969 ft: 15233 corp: 42/585b lim: 20 exec/s: 63 rss: 70Mb L: 8/20 MS: 1 ChangeBinInt- 00:07:38.060 #64 NEW cov: 11969 ft: 15236 corp: 43/603b lim: 20 exec/s: 64 rss: 70Mb L: 18/20 MS: 1 ShuffleBytes- 00:07:38.060 [2024-07-14 02:55:33.097884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.060 [2024-07-14 02:55:33.097911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.060 #65 NEW cov: 11969 ft: 15268 corp: 44/621b lim: 20 exec/s: 32 rss: 70Mb L: 18/20 MS: 1 ChangeBit- 00:07:38.060 #65 DONE cov: 11969 ft: 15268 corp: 44/621b lim: 20 exec/s: 32 rss: 70Mb 00:07:38.060 ###### Recommended dictionary. ###### 00:07:38.060 "\000\000\000\002" # Uses: 2 00:07:38.060 "\001\016" # Uses: 1 00:07:38.060 ###### End of recommended dictionary. 
###### 00:07:38.060 Done 65 runs in 2 second(s) 00:07:38.060 02:55:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:38.060 02:55:33 -- ../common.sh@72 -- # (( i++ )) 00:07:38.060 02:55:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.060 02:55:33 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:38.060 02:55:33 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:38.060 02:55:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.060 02:55:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.060 02:55:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:38.060 02:55:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:38.060 02:55:33 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:38.060 02:55:33 -- nvmf/run.sh@29 -- # port=4404 00:07:38.060 02:55:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:38.060 02:55:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:38.060 02:55:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.060 02:55:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:38.060 [2024-07-14 02:55:33.280875] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:38.060 [2024-07-14 02:55:33.280963] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid668323 ] 00:07:38.318 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.318 [2024-07-14 02:55:33.538674] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.318 [2024-07-14 02:55:33.565291] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:38.318 [2024-07-14 02:55:33.565415] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.577 [2024-07-14 02:55:33.617043] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.577 [2024-07-14 02:55:33.633335] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:38.577 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.577 INFO: Seed: 3716072562 00:07:38.577 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:38.577 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:38.577 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:38.577 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.577 #2 INITED exec/s: 0 rss: 59Mb 00:07:38.577 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:38.577 This may also happen if the target rejected all inputs we tried so far 00:07:38.577 [2024-07-14 02:55:33.678933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.577 [2024-07-14 02:55:33.678961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.577 [2024-07-14 02:55:33.679014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.577 [2024-07-14 02:55:33.679028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.577 [2024-07-14 02:55:33.679080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.577 [2024-07-14 02:55:33.679093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.577 [2024-07-14 02:55:33.679144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.577 [2024-07-14 02:55:33.679157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.836 NEW_FUNC[1/670]: 0x497920 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:38.836 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.836 #3 NEW cov: 11514 ft: 11533 corp: 2/33b lim: 35 exec/s: 0 rss: 66Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:38.836 [2024-07-14 02:55:33.979745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.836 [2024-07-14 02:55:33.979782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.836 [2024-07-14 02:55:33.979846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3ab83a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.836 [2024-07-14 02:55:33.979862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.836 [2024-07-14 02:55:33.979917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.836 [2024-07-14 02:55:33.979933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.836 [2024-07-14 02:55:33.979990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:33.980006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.837 NEW_FUNC[1/1]: 0x19703c0 in 
_reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:07:38.837 #4 NEW cov: 11645 ft: 12081 corp: 3/66b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertByte- 00:07:38.837 [2024-07-14 02:55:34.029477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:34.029504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.837 [2024-07-14 02:55:34.029558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:34.029572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.837 #5 NEW cov: 11651 ft: 12684 corp: 4/85b lim: 35 exec/s: 0 rss: 67Mb L: 19/33 MS: 1 EraseBytes- 00:07:38.837 [2024-07-14 02:55:34.069881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:34.069910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.837 [2024-07-14 02:55:34.069963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:34.069977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.837 [2024-07-14 02:55:34.070028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:34.070042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.837 [2024-07-14 02:55:34.070095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.837 [2024-07-14 02:55:34.070107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.096 #8 NEW cov: 11736 ft: 12904 corp: 5/116b lim: 35 exec/s: 0 rss: 67Mb L: 31/33 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:39.096 [2024-07-14 02:55:34.109984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fffdffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.096 [2024-07-14 02:55:34.110011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.096 [2024-07-14 02:55:34.110063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.096 [2024-07-14 02:55:34.110081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.096 [2024-07-14 02:55:34.110132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 
cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.096 [2024-07-14 02:55:34.110147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.096 [2024-07-14 02:55:34.110196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.096 [2024-07-14 02:55:34.110210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.096 #9 NEW cov: 11736 ft: 13032 corp: 6/147b lim: 35 exec/s: 0 rss: 67Mb L: 31/33 MS: 1 ChangeBit- 00:07:39.097 [2024-07-14 02:55:34.149757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fffdffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.149783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.149836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.149851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.097 #10 NEW cov: 11736 ft: 13118 corp: 7/163b lim: 35 exec/s: 0 rss: 67Mb L: 16/33 MS: 1 EraseBytes- 00:07:39.097 [2024-07-14 02:55:34.189942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.189967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.190021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.190035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.097 #11 NEW cov: 11736 ft: 13157 corp: 8/183b lim: 35 exec/s: 0 rss: 67Mb L: 20/33 MS: 1 InsertByte- 00:07:39.097 [2024-07-14 02:55:34.230317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fffdffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.230342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.230396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.230410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.230462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.230475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 
02:55:34.230512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.230526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.097 #12 NEW cov: 11736 ft: 13253 corp: 9/214b lim: 35 exec/s: 0 rss: 67Mb L: 31/33 MS: 1 ShuffleBytes- 00:07:39.097 [2024-07-14 02:55:34.270137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.270165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.270219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.270232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.097 #13 NEW cov: 11736 ft: 13273 corp: 10/234b lim: 35 exec/s: 0 rss: 68Mb L: 20/33 MS: 1 CopyPart- 00:07:39.097 [2024-07-14 02:55:34.310594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.310619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.310674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.310688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.310739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.310754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.097 [2024-07-14 02:55:34.310804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a5b3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.097 [2024-07-14 02:55:34.310818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.097 #14 NEW cov: 11736 ft: 13326 corp: 11/267b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CrossOver- 00:07:39.356 [2024-07-14 02:55:34.350404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:bec50003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.356 [2024-07-14 02:55:34.350431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.356 [2024-07-14 02:55:34.350488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c5c5c5c5 cdw11:c53a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.356 [2024-07-14 02:55:34.350504] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.356 #15 NEW cov: 11736 ft: 13371 corp: 12/286b lim: 35 exec/s: 0 rss: 68Mb L: 19/33 MS: 1 ChangeBinInt- 00:07:39.356 [2024-07-14 02:55:34.390794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.356 [2024-07-14 02:55:34.390821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.356 [2024-07-14 02:55:34.390874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.356 [2024-07-14 02:55:34.390888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.356 [2024-07-14 02:55:34.390942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.356 [2024-07-14 02:55:34.390956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.391009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ff3affff cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.391025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.357 #16 NEW cov: 11736 ft: 13392 corp: 13/319b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CrossOver- 00:07:39.357 [2024-07-14 02:55:34.430855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.430881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.430933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.430947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.430999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.431012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.431064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:be3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.431078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.357 #18 NEW cov: 11736 ft: 13433 corp: 14/348b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:39.357 [2024-07-14 02:55:34.470712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 
nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.470738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.470792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:be3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.470807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.357 #19 NEW cov: 11736 ft: 13443 corp: 15/363b lim: 35 exec/s: 0 rss: 68Mb L: 15/33 MS: 1 EraseBytes- 00:07:39.357 [2024-07-14 02:55:34.511108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.511135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.511188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.511202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.511254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.511267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.511318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a5b3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.511331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.357 #20 NEW cov: 11736 ft: 13489 corp: 16/397b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:07:39.357 [2024-07-14 02:55:34.551264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.551293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.551348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.551361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.551413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.551427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.551479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 
nsid:0 cdw10:bebebebe cdw11:ff3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.551493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.357 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:39.357 #21 NEW cov: 11759 ft: 13513 corp: 17/426b lim: 35 exec/s: 0 rss: 68Mb L: 29/34 MS: 1 ChangeByte- 00:07:39.357 [2024-07-14 02:55:34.591324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.591351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.591405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.591420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.591486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.591501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.357 [2024-07-14 02:55:34.591553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.357 [2024-07-14 02:55:34.591567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.617 #22 NEW cov: 11759 ft: 13536 corp: 18/457b lim: 35 exec/s: 0 rss: 68Mb L: 31/34 MS: 1 ChangeByte- 00:07:39.617 [2024-07-14 02:55:34.631188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:bec50003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.631214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.631268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c5c5c5c5 cdw11:c53a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.631283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.617 #23 NEW cov: 11759 ft: 13604 corp: 19/476b lim: 35 exec/s: 0 rss: 68Mb L: 19/34 MS: 1 ShuffleBytes- 00:07:39.617 [2024-07-14 02:55:34.671608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.671636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.671692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.671706] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.671759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.671773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.671823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.671837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.617 #24 NEW cov: 11759 ft: 13627 corp: 20/510b lim: 35 exec/s: 24 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:39.617 [2024-07-14 02:55:34.711735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.711763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.711816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.711830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.711880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff3affff cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.711895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.711945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:273a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.711959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.617 #25 NEW cov: 11759 ft: 13670 corp: 21/544b lim: 35 exec/s: 25 rss: 69Mb L: 34/34 MS: 1 CrossOver- 00:07:39.617 [2024-07-14 02:55:34.751877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.751902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.751954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0100ffff cdw11:02000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.751968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.752018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.752032] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.752082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.752095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.617 #26 NEW cov: 11759 ft: 13688 corp: 22/575b lim: 35 exec/s: 26 rss: 69Mb L: 31/34 MS: 1 CMP- DE: "\001\000\002\000"- 00:07:39.617 [2024-07-14 02:55:34.791663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:bec50003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.791688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.791740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c5c5c5cc cdw11:c53a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.791754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.617 #27 NEW cov: 11759 ft: 13771 corp: 23/594b lim: 35 exec/s: 27 rss: 69Mb L: 19/34 MS: 1 ChangeBinInt- 00:07:39.617 [2024-07-14 02:55:34.832073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.832100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.832152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.832166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.832217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.832231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.617 [2024-07-14 02:55:34.832281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.617 [2024-07-14 02:55:34.832295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.617 #28 NEW cov: 11759 ft: 13810 corp: 24/626b lim: 35 exec/s: 28 rss: 69Mb L: 32/34 MS: 1 CopyPart- 00:07:39.876 [2024-07-14 02:55:34.872185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.872213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.872268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.872282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.872334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.872349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.872399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.872412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.876 #29 NEW cov: 11759 ft: 13813 corp: 25/655b lim: 35 exec/s: 29 rss: 69Mb L: 29/34 MS: 1 CrossOver- 00:07:39.876 [2024-07-14 02:55:34.911979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100ffff cdw11:02000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.912005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.912057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.912071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.876 #30 NEW cov: 11759 ft: 13825 corp: 26/671b lim: 35 exec/s: 30 rss: 69Mb L: 16/34 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:07:39.876 [2024-07-14 02:55:34.952117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.952142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.952192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.952205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.876 #31 NEW cov: 11759 ft: 13846 corp: 27/690b lim: 35 exec/s: 31 rss: 69Mb L: 19/34 MS: 1 EraseBytes- 00:07:39.876 [2024-07-14 02:55:34.992563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.992589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.992643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:beba0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.992656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.992705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.992719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:34.992768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:ff3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:34.992781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.876 #32 NEW cov: 11759 ft: 13856 corp: 28/719b lim: 35 exec/s: 32 rss: 69Mb L: 29/34 MS: 1 ChangeBit- 00:07:39.876 [2024-07-14 02:55:35.032671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:35.032697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:35.032749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.876 [2024-07-14 02:55:35.032762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.876 [2024-07-14 02:55:35.032812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff3affff cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.032826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.877 [2024-07-14 02:55:35.032874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3affff cdw11:273a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.032887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.877 #33 NEW cov: 11759 ft: 13879 corp: 29/753b lim: 35 exec/s: 33 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:07:39.877 [2024-07-14 02:55:35.072804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.072831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.877 [2024-07-14 02:55:35.072882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.072896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.877 [2024-07-14 02:55:35.072947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.072961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:39.877 [2024-07-14 02:55:35.073010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.073023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.877 #34 NEW cov: 11759 ft: 13898 corp: 30/784b lim: 35 exec/s: 34 rss: 69Mb L: 31/34 MS: 1 ChangeBit- 00:07:39.877 [2024-07-14 02:55:35.112583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.112610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.877 [2024-07-14 02:55:35.112661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.877 [2024-07-14 02:55:35.112674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.136 #35 NEW cov: 11759 ft: 13903 corp: 31/804b lim: 35 exec/s: 35 rss: 69Mb L: 20/34 MS: 1 ChangeBinInt- 00:07:40.136 [2024-07-14 02:55:35.142984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.143010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.143063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.143077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.143127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.143141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.143194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff2c0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.143206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.136 #36 NEW cov: 11759 ft: 13984 corp: 32/835b lim: 35 exec/s: 36 rss: 69Mb L: 31/34 MS: 1 ChangeByte- 00:07:40.136 [2024-07-14 02:55:35.183094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.183123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.183174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 
02:55:35.183188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.183237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.183251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.183300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:be00bebe cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.183313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.136 #37 NEW cov: 11759 ft: 13996 corp: 33/866b lim: 35 exec/s: 37 rss: 69Mb L: 31/34 MS: 1 CrossOver- 00:07:40.136 [2024-07-14 02:55:35.212715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.212741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.136 #38 NEW cov: 11759 ft: 14767 corp: 34/876b lim: 35 exec/s: 38 rss: 69Mb L: 10/34 MS: 1 EraseBytes- 00:07:40.136 [2024-07-14 02:55:35.253130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.253156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.253205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.253219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.253270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3a3a3a3a cdw11:3a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.253284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.136 #39 NEW cov: 11759 ft: 14982 corp: 35/900b lim: 35 exec/s: 39 rss: 69Mb L: 24/34 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:07:40.136 [2024-07-14 02:55:35.293408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.293433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.293488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.293502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.293553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff3affff cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.293566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.293616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:273a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.293632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.136 #40 NEW cov: 11759 ft: 14988 corp: 36/934b lim: 35 exec/s: 40 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:07:40.136 [2024-07-14 02:55:35.333572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.333606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.333659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.333673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.333723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.136 [2024-07-14 02:55:35.333737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.136 [2024-07-14 02:55:35.333788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:4141ff41 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.137 [2024-07-14 02:55:35.333801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.137 #41 NEW cov: 11759 ft: 14989 corp: 37/968b lim: 35 exec/s: 41 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:40.137 [2024-07-14 02:55:35.363629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.137 [2024-07-14 02:55:35.363656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.137 [2024-07-14 02:55:35.363706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.137 [2024-07-14 02:55:35.363721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.137 [2024-07-14 02:55:35.363758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.137 [2024-07-14 02:55:35.363771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.137 [2024-07-14 02:55:35.363819] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:4141ff41 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.137 [2024-07-14 02:55:35.363832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.397 #42 NEW cov: 11759 ft: 14994 corp: 38/1002b lim: 35 exec/s: 42 rss: 70Mb L: 34/34 MS: 1 ChangeByte- 00:07:40.397 [2024-07-14 02:55:35.403729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.403755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.403809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a270003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.403823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.403872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.403889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.403933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3ae6e6 cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.403946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.397 #43 NEW cov: 11759 ft: 15006 corp: 39/1032b lim: 35 exec/s: 43 rss: 70Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:07:40.397 [2024-07-14 02:55:35.433819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.433844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.433895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3ab83a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.433909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.433957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.433971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.434021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:213a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.434034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.397 #44 NEW cov: 11759 ft: 15010 
corp: 40/1065b lim: 35 exec/s: 44 rss: 70Mb L: 33/34 MS: 1 ChangeBinInt- 00:07:40.397 [2024-07-14 02:55:35.473605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.473630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.473681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.473695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.397 #45 NEW cov: 11759 ft: 15026 corp: 41/1083b lim: 35 exec/s: 45 rss: 70Mb L: 18/34 MS: 1 EraseBytes- 00:07:40.397 [2024-07-14 02:55:35.513902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.513928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.513980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.513993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.514044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3a3a273a cdw11:3a3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.514057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.397 #46 NEW cov: 11759 ft: 15040 corp: 42/1105b lim: 35 exec/s: 46 rss: 70Mb L: 22/34 MS: 1 CMP- DE: "\001\000"- 00:07:40.397 [2024-07-14 02:55:35.553854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bebeb6be cdw11:bebe0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.553880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.553931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:be3a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.553944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.397 #47 NEW cov: 11759 ft: 15059 corp: 43/1120b lim: 35 exec/s: 47 rss: 70Mb L: 15/34 MS: 1 ChangeBit- 00:07:40.397 [2024-07-14 02:55:35.594288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.594314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.594367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffffff cdw11:ffff0003 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.594381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.594432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff3affff cdw11:273a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.594450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.594504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3a3a3a3a cdw11:273a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.594518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.397 #48 NEW cov: 11759 ft: 15060 corp: 44/1154b lim: 35 exec/s: 48 rss: 70Mb L: 34/34 MS: 1 ChangeByte- 00:07:40.397 [2024-07-14 02:55:35.624064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.624089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.397 [2024-07-14 02:55:35.624142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.397 [2024-07-14 02:55:35.624156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.657 #49 NEW cov: 11759 ft: 15065 corp: 45/1172b lim: 35 exec/s: 49 rss: 70Mb L: 18/34 MS: 1 ShuffleBytes- 00:07:40.657 [2024-07-14 02:55:35.664181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.657 [2024-07-14 02:55:35.664207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.657 [2024-07-14 02:55:35.664259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.657 [2024-07-14 02:55:35.664272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.657 #50 NEW cov: 11759 ft: 15072 corp: 46/1190b lim: 35 exec/s: 25 rss: 70Mb L: 18/34 MS: 1 ShuffleBytes- 00:07:40.657 #50 DONE cov: 11759 ft: 15072 corp: 46/1190b lim: 35 exec/s: 25 rss: 70Mb 00:07:40.657 ###### Recommended dictionary. ###### 00:07:40.657 "\001\000\002\000" # Uses: 2 00:07:40.657 "\001\000" # Uses: 0 00:07:40.657 ###### End of recommended dictionary. 
###### 00:07:40.657 Done 50 runs in 2 second(s) 00:07:40.657 02:55:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:40.657 02:55:35 -- ../common.sh@72 -- # (( i++ )) 00:07:40.657 02:55:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.657 02:55:35 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:40.657 02:55:35 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:40.657 02:55:35 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.657 02:55:35 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.657 02:55:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:40.657 02:55:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:40.657 02:55:35 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:40.657 02:55:35 -- nvmf/run.sh@29 -- # port=4405 00:07:40.657 02:55:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:40.657 02:55:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:40.657 02:55:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.657 02:55:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:40.657 [2024-07-14 02:55:35.838250] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:40.657 [2024-07-14 02:55:35.838317] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid668656 ] 00:07:40.657 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.917 [2024-07-14 02:55:36.083408] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.917 [2024-07-14 02:55:36.111592] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:40.917 [2024-07-14 02:55:36.111711] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.917 [2024-07-14 02:55:36.163123] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.176 [2024-07-14 02:55:36.179411] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:41.176 INFO: Running with entropic power schedule (0xFF, 100). 00:07:41.176 INFO: Seed: 1967112270 00:07:41.176 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:41.176 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:41.176 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:41.176 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.176 #2 INITED exec/s: 0 rss: 59Mb 00:07:41.176 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:41.176 This may also happen if the target rejected all inputs we tried so far 00:07:41.176 [2024-07-14 02:55:36.246336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b6b6b6 cdw11:b6b60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.176 [2024-07-14 02:55:36.246375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.434 NEW_FUNC[1/671]: 0x499ab0 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:41.434 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.434 #4 NEW cov: 11543 ft: 11536 corp: 2/13b lim: 45 exec/s: 0 rss: 66Mb L: 12/12 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:41.434 [2024-07-14 02:55:36.586617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.434 [2024-07-14 02:55:36.586662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.434 #5 NEW cov: 11656 ft: 12239 corp: 3/25b lim: 45 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 CMP- DE: "\001\000\000\022"- 00:07:41.434 [2024-07-14 02:55:36.636580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.434 [2024-07-14 02:55:36.636609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.434 #6 NEW cov: 11662 ft: 12466 corp: 4/41b lim: 45 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 PersAutoDict- DE: "\001\000\000\022"- 00:07:41.434 [2024-07-14 02:55:36.676687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b6b6b6 cdw11:b6b60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.434 [2024-07-14 02:55:36.676714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #7 NEW cov: 11747 ft: 12787 corp: 5/53b lim: 45 exec/s: 0 rss: 67Mb L: 12/16 MS: 1 ChangeByte- 00:07:41.693 [2024-07-14 02:55:36.716708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b6b6b6 cdw11:b6b60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.693 [2024-07-14 02:55:36.716739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #8 NEW cov: 11747 ft: 12873 corp: 6/65b lim: 45 exec/s: 0 rss: 67Mb L: 12/16 MS: 1 ChangeBit- 00:07:41.693 [2024-07-14 02:55:36.756926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6b60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.693 [2024-07-14 02:55:36.756954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #9 NEW cov: 11747 ft: 12965 corp: 7/77b lim: 45 exec/s: 0 rss: 67Mb L: 12/16 MS: 1 ChangeBit- 00:07:41.693 [2024-07-14 02:55:36.797085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:01000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:41.693 [2024-07-14 02:55:36.797111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #15 NEW cov: 11747 ft: 13032 corp: 8/93b lim: 45 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 PersAutoDict- DE: "\001\000\000\022"- 00:07:41.693 [2024-07-14 02:55:36.837205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.693 [2024-07-14 02:55:36.837230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #16 NEW cov: 11747 ft: 13065 corp: 9/106b lim: 45 exec/s: 0 rss: 67Mb L: 13/16 MS: 1 InsertByte- 00:07:41.693 [2024-07-14 02:55:36.877217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.693 [2024-07-14 02:55:36.877245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #17 NEW cov: 11747 ft: 13161 corp: 10/123b lim: 45 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 InsertByte- 00:07:41.693 [2024-07-14 02:55:36.917507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.693 [2024-07-14 02:55:36.917533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.693 #23 NEW cov: 11747 ft: 13271 corp: 11/136b lim: 45 exec/s: 0 rss: 68Mb L: 13/17 MS: 1 ChangeByte- 00:07:41.953 [2024-07-14 02:55:36.967867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6feb6b6 cdw11:fefe0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:36.967894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.953 [2024-07-14 02:55:36.968009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fe01fefe cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:36.968026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.953 #24 NEW cov: 11747 ft: 14087 corp: 12/158b lim: 45 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:41.953 [2024-07-14 02:55:37.007685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:37.007712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.953 #25 NEW cov: 11747 ft: 14092 corp: 13/170b lim: 45 exec/s: 0 rss: 68Mb L: 12/22 MS: 1 EraseBytes- 00:07:41.953 [2024-07-14 02:55:37.047789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b653b6 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:37.047816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.953 #26 NEW cov: 11747 ft: 14115 corp: 14/187b lim: 45 exec/s: 0 rss: 68Mb L: 
17/22 MS: 1 InsertByte- 00:07:41.953 [2024-07-14 02:55:37.087943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0100b6b6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:37.087969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.953 #27 NEW cov: 11747 ft: 14130 corp: 15/199b lim: 45 exec/s: 0 rss: 68Mb L: 12/22 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:41.953 [2024-07-14 02:55:37.128014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:37.128040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.953 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.953 #28 NEW cov: 11770 ft: 14161 corp: 16/215b lim: 45 exec/s: 0 rss: 68Mb L: 16/22 MS: 1 CopyPart- 00:07:41.953 [2024-07-14 02:55:37.168170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:4db60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.953 [2024-07-14 02:55:37.168197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.953 #29 NEW cov: 11770 ft: 14177 corp: 17/225b lim: 45 exec/s: 0 rss: 68Mb L: 10/22 MS: 1 EraseBytes- 00:07:42.212 [2024-07-14 02:55:37.208403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.208430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 #30 NEW cov: 11770 ft: 14191 corp: 18/242b lim: 45 exec/s: 30 rss: 68Mb L: 17/22 MS: 1 InsertByte- 00:07:42.212 [2024-07-14 02:55:37.248437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:494d4349 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.248466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 #36 NEW cov: 11770 ft: 14193 corp: 19/259b lim: 45 exec/s: 36 rss: 68Mb L: 17/22 MS: 1 ChangeBinInt- 00:07:42.212 [2024-07-14 02:55:37.288590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:05000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.288615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 #37 NEW cov: 11770 ft: 14208 corp: 20/275b lim: 45 exec/s: 37 rss: 68Mb L: 16/22 MS: 1 ChangeBit- 00:07:42.212 [2024-07-14 02:55:37.328744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:051a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.328770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 #38 NEW cov: 11770 ft: 14220 corp: 21/291b lim: 45 exec/s: 38 rss: 68Mb L: 16/22 MS: 1 ChangeByte- 
00:07:42.212 [2024-07-14 02:55:37.368819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:494d4349 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.368848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 #39 NEW cov: 11770 ft: 14259 corp: 22/308b lim: 45 exec/s: 39 rss: 68Mb L: 17/22 MS: 1 CopyPart- 00:07:42.212 [2024-07-14 02:55:37.409145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:001d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.409173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 [2024-07-14 02:55:37.409293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1d1d1d1d cdw11:1d000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.409311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.212 #40 NEW cov: 11770 ft: 14298 corp: 23/329b lim: 45 exec/s: 40 rss: 69Mb L: 21/22 MS: 1 InsertRepeatedBytes- 00:07:42.212 [2024-07-14 02:55:37.459401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.212 [2024-07-14 02:55:37.459430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.212 [2024-07-14 02:55:37.459554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b6e4b6b6 cdw11:b6b60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.213 [2024-07-14 02:55:37.459571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.471 #41 NEW cov: 11770 ft: 14304 corp: 24/349b lim: 45 exec/s: 41 rss: 69Mb L: 20/22 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:42.471 [2024-07-14 02:55:37.500054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:494d4349 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.500081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.471 [2024-07-14 02:55:37.500198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1c1c1c1c cdw11:1c1c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.500215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.471 [2024-07-14 02:55:37.500334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1c1c1c1c cdw11:1c1c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.500352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.471 [2024-07-14 02:55:37.500470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b610b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 
02:55:37.500488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.471 #42 NEW cov: 11770 ft: 14730 corp: 25/385b lim: 45 exec/s: 42 rss: 69Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:42.471 [2024-07-14 02:55:37.539362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b6b6b6 cdw11:b6b60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.539388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.471 #43 NEW cov: 11770 ft: 14755 corp: 26/397b lim: 45 exec/s: 43 rss: 69Mb L: 12/36 MS: 1 ChangeBinInt- 00:07:42.471 [2024-07-14 02:55:37.589856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.589883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.471 [2024-07-14 02:55:37.589994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b6e4b6b6 cdw11:b6b60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.590010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.471 #44 NEW cov: 11770 ft: 14795 corp: 27/420b lim: 45 exec/s: 44 rss: 69Mb L: 23/36 MS: 1 CopyPart- 00:07:42.471 [2024-07-14 02:55:37.629670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b629b6b6 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.629696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.471 #45 NEW cov: 11770 ft: 14796 corp: 28/436b lim: 45 exec/s: 45 rss: 69Mb L: 16/36 MS: 1 ChangeByte- 00:07:42.471 [2024-07-14 02:55:37.669756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.669784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.471 #46 NEW cov: 11770 ft: 14807 corp: 29/453b lim: 45 exec/s: 46 rss: 69Mb L: 17/36 MS: 1 ChangeBinInt- 00:07:42.471 [2024-07-14 02:55:37.709853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b653b6 cdw11:00120005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.471 [2024-07-14 02:55:37.709880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.729 #47 NEW cov: 11770 ft: 14824 corp: 30/464b lim: 45 exec/s: 47 rss: 69Mb L: 11/36 MS: 1 EraseBytes- 00:07:42.729 [2024-07-14 02:55:37.750293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:05000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.729 [2024-07-14 02:55:37.750321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.729 [2024-07-14 02:55:37.750435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 
cdw10:12000012 cdw11:00120005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.729 [2024-07-14 02:55:37.750456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.729 #48 NEW cov: 11770 ft: 14891 corp: 31/484b lim: 45 exec/s: 48 rss: 69Mb L: 20/36 MS: 1 PersAutoDict- DE: "\001\000\000\022"- 00:07:42.730 [2024-07-14 02:55:37.790364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6feb6b6 cdw11:fefe0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.790392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.730 [2024-07-14 02:55:37.790512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00fefefe cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.790529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.730 #49 NEW cov: 11770 ft: 14924 corp: 32/506b lim: 45 exec/s: 49 rss: 69Mb L: 22/36 MS: 1 ShuffleBytes- 00:07:42.730 [2024-07-14 02:55:37.840292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b629b6b6 cdw11:01290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.840319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.730 #50 NEW cov: 11770 ft: 14928 corp: 33/523b lim: 45 exec/s: 50 rss: 69Mb L: 17/36 MS: 1 InsertByte- 00:07:42.730 [2024-07-14 02:55:37.880330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.880356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.730 [2024-07-14 02:55:37.880502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1031b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.880519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.730 #51 NEW cov: 11770 ft: 14955 corp: 34/541b lim: 45 exec/s: 51 rss: 69Mb L: 18/36 MS: 1 InsertByte- 00:07:42.730 [2024-07-14 02:55:37.920796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.920822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.730 [2024-07-14 02:55:37.920937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b6e4b6b6 cdw11:08000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.920955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.730 #52 NEW cov: 11770 ft: 14969 corp: 35/565b lim: 45 exec/s: 52 rss: 69Mb L: 24/36 MS: 1 CMP- DE: "\010\000\000\000"- 00:07:42.730 [2024-07-14 02:55:37.970887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.970914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.730 [2024-07-14 02:55:37.971030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b6e4b6b6 cdw11:08000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.730 [2024-07-14 02:55:37.971046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.988 #53 NEW cov: 11770 ft: 14975 corp: 36/589b lim: 45 exec/s: 53 rss: 69Mb L: 24/36 MS: 1 ChangeByte- 00:07:42.988 [2024-07-14 02:55:38.010766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b653b6 cdw11:00120005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.988 [2024-07-14 02:55:38.010793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.988 #54 NEW cov: 11770 ft: 14982 corp: 37/603b lim: 45 exec/s: 54 rss: 70Mb L: 14/36 MS: 1 CopyPart- 00:07:42.988 [2024-07-14 02:55:38.051436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6feb6b6 cdw11:fefe0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.988 [2024-07-14 02:55:38.051481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.988 [2024-07-14 02:55:38.051605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0001fefe cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.988 [2024-07-14 02:55:38.051623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.988 [2024-07-14 02:55:38.051744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fe010000 cdw11:00b60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.988 [2024-07-14 02:55:38.051762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.988 #55 NEW cov: 11770 ft: 15212 corp: 38/633b lim: 45 exec/s: 55 rss: 70Mb L: 30/36 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:42.989 [2024-07-14 02:55:38.101548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b629b6b6 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.101574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.101683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.101699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.101811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0000ff01 cdw11:12000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.101827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.989 #56 NEW cov: 11770 ft: 15231 corp: 39/664b lim: 45 exec/s: 56 rss: 70Mb L: 31/36 MS: 1 InsertRepeatedBytes- 00:07:42.989 [2024-07-14 02:55:38.151200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:05000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.151228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.989 #57 NEW cov: 11770 ft: 15344 corp: 40/680b lim: 45 exec/s: 57 rss: 70Mb L: 16/36 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:42.989 [2024-07-14 02:55:38.192158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b601b6b6 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.192183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.192307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b6b60012 cdw11:b6b60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.192326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.192445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:19191919 cdw11:19190000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.192461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.192579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:19191919 cdw11:19190000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.192609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.989 #58 NEW cov: 11770 ft: 15346 corp: 41/723b lim: 45 exec/s: 58 rss: 70Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:07:42.989 [2024-07-14 02:55:38.232221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6b2b6b6 cdw11:b6e40005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.232247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.232380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.232399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.232517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff01ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.232533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.989 [2024-07-14 02:55:38.232667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b6b612b6 cdw11:b6b60005 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:42.989 [2024-07-14 02:55:38.232683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.248 #59 NEW cov: 11770 ft: 15369 corp: 42/765b lim: 45 exec/s: 29 rss: 70Mb L: 42/43 MS: 1 CrossOver- 00:07:43.248 #59 DONE cov: 11770 ft: 15369 corp: 42/765b lim: 45 exec/s: 29 rss: 70Mb 00:07:43.248 ###### Recommended dictionary. ###### 00:07:43.248 "\001\000\000\022" # Uses: 3 00:07:43.248 "\001\000\000\000\000\000\000\000" # Uses: 1 00:07:43.248 "\000\000\000\000" # Uses: 1 00:07:43.248 "\010\000\000\000" # Uses: 0 00:07:43.248 ###### End of recommended dictionary. ###### 00:07:43.248 Done 59 runs in 2 second(s) 00:07:43.248 02:55:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:43.248 02:55:38 -- ../common.sh@72 -- # (( i++ )) 00:07:43.248 02:55:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:43.248 02:55:38 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:43.248 02:55:38 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:43.248 02:55:38 -- nvmf/run.sh@24 -- # local timen=1 00:07:43.248 02:55:38 -- nvmf/run.sh@25 -- # local core=0x1 00:07:43.248 02:55:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:43.248 02:55:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:43.248 02:55:38 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:43.248 02:55:38 -- nvmf/run.sh@29 -- # port=4406 00:07:43.248 02:55:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:43.248 02:55:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:43.248 02:55:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:43.248 02:55:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:43.248 [2024-07-14 02:55:38.410510] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:43.248 [2024-07-14 02:55:38.410572] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid669194 ] 00:07:43.248 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.507 [2024-07-14 02:55:38.659016] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.507 [2024-07-14 02:55:38.687469] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:43.507 [2024-07-14 02:55:38.687583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.507 [2024-07-14 02:55:38.738896] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.507 [2024-07-14 02:55:38.755203] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:43.765 INFO: Running with entropic power schedule (0xFF, 100). 
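The nvmf/run.sh trace just above shows how run 6 is wired up before libFuzzer starts: the fuzzer index is zero-padded with printf %02d and appended to 44 to form a per-instance port (4406 here), sed retargets the stock fuzz_json.conf from the default trsvcid 4420 to that port, a per-run corpus directory is created, and llvm_nvme_fuzz is launched against the resulting TCP trid. A minimal sketch of those steps, with $SPDK_DIR standing in for the long Jenkins workspace paths and the redirect into the temp config assumed (the trace shows the sed command but not where its output goes):

  #!/usr/bin/env bash
  # Per-instance setup paraphrased from the nvmf/run.sh trace above (run 6).
  fuzzer_type=6                                   # the -Z index of this run
  corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type"
  nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
  port="44$(printf %02d "$fuzzer_type")"          # 6 -> 4406, 7 -> 4407, ...
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Point the stock JSON config at this instance's port instead of 4420.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # Flags copied from the traced invocation: core mask, memory size, output
  # dir, target trid, config, run time, corpus dir, fuzzer index, RPC socket.
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 \
      -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"

For a different instance only fuzzer_type changes; run 7 below goes through the same steps with port 4407 and /tmp/fuzz_json_7.conf.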
00:07:43.765 INFO: Seed: 248142615 00:07:43.765 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:43.765 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:43.765 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:43.765 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.765 #2 INITED exec/s: 0 rss: 59Mb 00:07:43.765 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.765 This may also happen if the target rejected all inputs we tried so far 00:07:43.765 [2024-07-14 02:55:38.800295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:43.765 [2024-07-14 02:55:38.800322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.024 NEW_FUNC[1/669]: 0x49c2c0 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:44.024 NEW_FUNC[2/669]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.024 #3 NEW cov: 11460 ft: 11461 corp: 2/3b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:07:44.024 [2024-07-14 02:55:39.101222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.024 [2024-07-14 02:55:39.101255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.024 #4 NEW cov: 11573 ft: 11887 corp: 3/5b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:44.024 [2024-07-14 02:55:39.141208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.024 [2024-07-14 02:55:39.141237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.024 #5 NEW cov: 11579 ft: 12207 corp: 4/7b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:07:44.024 [2024-07-14 02:55:39.171255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:44.024 [2024-07-14 02:55:39.171282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.024 #6 NEW cov: 11664 ft: 12533 corp: 5/9b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:44.024 [2024-07-14 02:55:39.201404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.024 [2024-07-14 02:55:39.201430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.024 #8 NEW cov: 11664 ft: 12681 corp: 6/12b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 2 EraseBytes-CrossOver- 00:07:44.024 [2024-07-14 02:55:39.241661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000044ff cdw11:00000000 00:07:44.024 [2024-07-14 02:55:39.241688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.024 [2024-07-14 02:55:39.241742] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:44.024 [2024-07-14 02:55:39.241756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.024 #12 NEW cov: 11664 ft: 12973 corp: 7/17b lim: 10 exec/s: 0 rss: 67Mb L: 5/5 MS: 4 EraseBytes-ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:44.284 [2024-07-14 02:55:39.281642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.281668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 #13 NEW cov: 11664 ft: 13126 corp: 8/19b lim: 10 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:44.284 [2024-07-14 02:55:39.321733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.321759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 #14 NEW cov: 11664 ft: 13175 corp: 9/22b lim: 10 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 CopyPart- 00:07:44.284 [2024-07-14 02:55:39.361887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.361913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 #15 NEW cov: 11664 ft: 13254 corp: 10/25b lim: 10 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 ChangeBit- 00:07:44.284 [2024-07-14 02:55:39.402108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.402134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 [2024-07-14 02:55:39.402187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.402200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.284 #16 NEW cov: 11664 ft: 13370 corp: 11/29b lim: 10 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CrossOver- 00:07:44.284 [2024-07-14 02:55:39.442133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007b0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.442159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 #17 NEW cov: 11664 ft: 13423 corp: 12/31b lim: 10 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeByte- 00:07:44.284 [2024-07-14 02:55:39.482361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.482387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 [2024-07-14 02:55:39.482445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.482459] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.284 #18 NEW cov: 11664 ft: 13505 corp: 13/35b lim: 10 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeBit- 00:07:44.284 [2024-07-14 02:55:39.522496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.522521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.284 [2024-07-14 02:55:39.522574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.284 [2024-07-14 02:55:39.522587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.544 #19 NEW cov: 11664 ft: 13516 corp: 14/39b lim: 10 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:44.544 [2024-07-14 02:55:39.562811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.562836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.544 [2024-07-14 02:55:39.562890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008585 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.562904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.544 [2024-07-14 02:55:39.562959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008585 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.562973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.544 [2024-07-14 02:55:39.563024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008585 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.563038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.544 #20 NEW cov: 11664 ft: 13791 corp: 15/48b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:44.544 [2024-07-14 02:55:39.602764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.602790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.544 [2024-07-14 02:55:39.602842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b11 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.602856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.544 #21 NEW cov: 11664 ft: 13887 corp: 16/52b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 ChangeByte- 00:07:44.544 [2024-07-14 02:55:39.642686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a17 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.642711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
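The #N records threaded through the SPDK notices are libFuzzer's own progress lines: cov counts covered code edges, ft coverage features, corp gives the corpus as inputs/bytes, lim the current input-length cap, exec/s the execution rate, rss resident memory, L the new input's size against the largest in the corpus, and MS the mutation sequence that produced it; the #25 record just below, for example, came from stacking EraseBytes, ShuffleBytes, CrossOver and InsertByte. When this console output is saved to a file, a small filter pulls just those progress records out of the interleaved notices; console.log is an assumed name for that saved copy:

  # List each libFuzzer progress record with its execution rate.
  grep -E '#[0-9]+ (INITED|NEW|DONE) ' console.log |
    awk '{ for (f = 1; f < NF; f++) if ($f == "exec/s:") print $2, $3, "exec/s=" $(f+1) }'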
00:07:44.544 #25 NEW cov: 11664 ft: 13912 corp: 17/54b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 4 EraseBytes-ShuffleBytes-CrossOver-InsertByte- 00:07:44.544 [2024-07-14 02:55:39.672804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a40 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.672828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.544 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.544 #27 NEW cov: 11687 ft: 13920 corp: 18/57b lim: 10 exec/s: 0 rss: 68Mb L: 3/9 MS: 2 EraseBytes-CrossOver- 00:07:44.544 [2024-07-14 02:55:39.712935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.712960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.544 #28 NEW cov: 11687 ft: 14013 corp: 19/59b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 CopyPart- 00:07:44.544 [2024-07-14 02:55:39.753190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.753215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.544 [2024-07-14 02:55:39.753269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b0e cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.753283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.544 #29 NEW cov: 11687 ft: 14018 corp: 20/63b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 ChangeBit- 00:07:44.544 [2024-07-14 02:55:39.783136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f017 cdw11:00000000 00:07:44.544 [2024-07-14 02:55:39.783162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 #30 NEW cov: 11687 ft: 14028 corp: 21/65b lim: 10 exec/s: 30 rss: 68Mb L: 2/9 MS: 1 ChangeBinInt- 00:07:44.804 [2024-07-14 02:55:39.823252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b90a cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.823278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 #31 NEW cov: 11687 ft: 14032 corp: 22/67b lim: 10 exec/s: 31 rss: 68Mb L: 2/9 MS: 1 ChangeByte- 00:07:44.804 [2024-07-14 02:55:39.863765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.863792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 [2024-07-14 02:55:39.863848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.863862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.804 [2024-07-14 02:55:39.863914] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.863929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.804 [2024-07-14 02:55:39.863981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.863994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.804 #32 NEW cov: 11687 ft: 14065 corp: 23/75b lim: 10 exec/s: 32 rss: 68Mb L: 8/9 MS: 1 InsertRepeatedBytes- 00:07:44.804 [2024-07-14 02:55:39.903643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.903668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 [2024-07-14 02:55:39.903723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d011 cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.903737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.804 #33 NEW cov: 11687 ft: 14080 corp: 24/79b lim: 10 exec/s: 33 rss: 68Mb L: 4/9 MS: 1 ChangeByte- 00:07:44.804 [2024-07-14 02:55:39.943652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a28 cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.943678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 #34 NEW cov: 11687 ft: 14108 corp: 25/81b lim: 10 exec/s: 34 rss: 68Mb L: 2/9 MS: 1 InsertByte- 00:07:44.804 [2024-07-14 02:55:39.973725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000b0a cdw11:00000000 00:07:44.804 [2024-07-14 02:55:39.973750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 #35 NEW cov: 11687 ft: 14111 corp: 26/83b lim: 10 exec/s: 35 rss: 69Mb L: 2/9 MS: 1 CrossOver- 00:07:44.804 [2024-07-14 02:55:40.013854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.804 [2024-07-14 02:55:40.013880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.804 #36 NEW cov: 11687 ft: 14124 corp: 27/86b lim: 10 exec/s: 36 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:07:44.804 [2024-07-14 02:55:40.054021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aea cdw11:00000000 00:07:44.804 [2024-07-14 02:55:40.054047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.063 #37 NEW cov: 11687 ft: 14132 corp: 28/89b lim: 10 exec/s: 37 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:07:45.063 [2024-07-14 02:55:40.094145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d9f0 cdw11:00000000 00:07:45.063 [2024-07-14 02:55:40.094173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.063 #38 NEW cov: 11687 ft: 14141 corp: 29/92b lim: 10 exec/s: 38 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:07:45.063 [2024-07-14 02:55:40.134235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f817 cdw11:00000000 00:07:45.063 [2024-07-14 02:55:40.134261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.063 #39 NEW cov: 11687 ft: 14156 corp: 30/94b lim: 10 exec/s: 39 rss: 69Mb L: 2/9 MS: 1 ChangeBit- 00:07:45.063 [2024-07-14 02:55:40.164454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.063 [2024-07-14 02:55:40.164480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.063 [2024-07-14 02:55:40.164535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001b0e cdw11:00000000 00:07:45.063 [2024-07-14 02:55:40.164550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.063 #40 NEW cov: 11687 ft: 14200 corp: 31/98b lim: 10 exec/s: 40 rss: 69Mb L: 4/9 MS: 1 ChangeBit- 00:07:45.063 [2024-07-14 02:55:40.204439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.063 [2024-07-14 02:55:40.204468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.063 #42 NEW cov: 11687 ft: 14223 corp: 32/101b lim: 10 exec/s: 42 rss: 69Mb L: 3/9 MS: 2 EraseBytes-CrossOver- 00:07:45.063 [2024-07-14 02:55:40.244562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:45.064 [2024-07-14 02:55:40.244588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.064 #43 NEW cov: 11687 ft: 14236 corp: 33/104b lim: 10 exec/s: 43 rss: 69Mb L: 3/9 MS: 1 ChangeBit- 00:07:45.064 [2024-07-14 02:55:40.284644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.064 [2024-07-14 02:55:40.284671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.064 #44 NEW cov: 11687 ft: 14247 corp: 34/106b lim: 10 exec/s: 44 rss: 69Mb L: 2/9 MS: 1 EraseBytes- 00:07:45.322 [2024-07-14 02:55:40.324708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.324734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.322 #45 NEW cov: 11687 ft: 14292 corp: 35/109b lim: 10 exec/s: 45 rss: 69Mb L: 3/9 MS: 1 CrossOver- 00:07:45.322 [2024-07-14 02:55:40.364814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a28 cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.364841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.322 #46 NEW cov: 11687 ft: 14327 corp: 36/111b lim: 10 
exec/s: 46 rss: 69Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:45.322 [2024-07-14 02:55:40.405112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d90a cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.405141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.322 [2024-07-14 02:55:40.405196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.405211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.322 #47 NEW cov: 11687 ft: 14334 corp: 37/116b lim: 10 exec/s: 47 rss: 69Mb L: 5/9 MS: 1 CrossOver- 00:07:45.322 [2024-07-14 02:55:40.445208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.445235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.322 [2024-07-14 02:55:40.445289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006d0a cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.445306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.322 #48 NEW cov: 11687 ft: 14337 corp: 38/120b lim: 10 exec/s: 48 rss: 69Mb L: 4/9 MS: 1 ChangeByte- 00:07:45.322 [2024-07-14 02:55:40.485326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f7f5 cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.485352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.322 [2024-07-14 02:55:40.485406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000092f5 cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.485421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.322 #49 NEW cov: 11687 ft: 14348 corp: 39/124b lim: 10 exec/s: 49 rss: 69Mb L: 4/9 MS: 1 ChangeBinInt- 00:07:45.322 [2024-07-14 02:55:40.525399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.525425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.322 [2024-07-14 02:55:40.525485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b11 cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.525500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.322 #50 NEW cov: 11687 ft: 14404 corp: 40/128b lim: 10 exec/s: 50 rss: 70Mb L: 4/9 MS: 1 ShuffleBytes- 00:07:45.322 [2024-07-14 02:55:40.565428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:45.322 [2024-07-14 02:55:40.565459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 #51 NEW cov: 11687 ft: 14423 corp: 41/131b lim: 10 
exec/s: 51 rss: 70Mb L: 3/9 MS: 1 ShuffleBytes- 00:07:45.582 [2024-07-14 02:55:40.605670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.605697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 [2024-07-14 02:55:40.605750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.605766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.582 #52 NEW cov: 11687 ft: 14439 corp: 42/135b lim: 10 exec/s: 52 rss: 70Mb L: 4/9 MS: 1 CrossOver- 00:07:45.582 [2024-07-14 02:55:40.645655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.645681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 #53 NEW cov: 11687 ft: 14457 corp: 43/137b lim: 10 exec/s: 53 rss: 70Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:45.582 [2024-07-14 02:55:40.685767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f017 cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.685794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 #54 NEW cov: 11687 ft: 14486 corp: 44/139b lim: 10 exec/s: 54 rss: 70Mb L: 2/9 MS: 1 CopyPart- 00:07:45.582 [2024-07-14 02:55:40.715856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4a cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.715882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 #55 NEW cov: 11687 ft: 14499 corp: 45/141b lim: 10 exec/s: 55 rss: 70Mb L: 2/9 MS: 1 ChangeBit- 00:07:45.582 [2024-07-14 02:55:40.755979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aab cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.756007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 #56 NEW cov: 11687 ft: 14535 corp: 46/143b lim: 10 exec/s: 56 rss: 70Mb L: 2/9 MS: 1 InsertByte- 00:07:45.582 [2024-07-14 02:55:40.786035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000b0a cdw11:00000000 00:07:45.582 [2024-07-14 02:55:40.786061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.582 #57 NEW cov: 11687 ft: 14563 corp: 47/145b lim: 10 exec/s: 28 rss: 70Mb L: 2/9 MS: 1 CopyPart- 00:07:45.582 #57 DONE cov: 11687 ft: 14563 corp: 47/145b lim: 10 exec/s: 28 rss: 70Mb 00:07:45.582 Done 57 runs in 2 second(s) 00:07:45.842 02:55:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:45.842 02:55:40 -- ../common.sh@72 -- # (( i++ )) 00:07:45.842 02:55:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.842 02:55:40 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:45.842 02:55:40 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:45.842 02:55:40 -- nvmf/run.sh@24 -- # local timen=1 
00:07:45.842 02:55:40 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.842 02:55:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:45.842 02:55:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:45.842 02:55:40 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:45.842 02:55:40 -- nvmf/run.sh@29 -- # port=4407 00:07:45.842 02:55:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:45.842 02:55:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:45.842 02:55:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.842 02:55:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:45.842 [2024-07-14 02:55:40.968745] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:45.842 [2024-07-14 02:55:40.968815] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid669737 ] 00:07:45.842 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.100 [2024-07-14 02:55:41.218262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.100 [2024-07-14 02:55:41.247023] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.100 [2024-07-14 02:55:41.247142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.100 [2024-07-14 02:55:41.298514] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.100 [2024-07-14 02:55:41.314809] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:46.100 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.100 INFO: Seed: 2807141460 00:07:46.100 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:46.100 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:46.100 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:46.100 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.100 #2 INITED exec/s: 0 rss: 59Mb 00:07:46.100 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:46.100 This may also happen if the target rejected all inputs we tried so far 00:07:46.357 [2024-07-14 02:55:41.364201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.357 [2024-07-14 02:55:41.364230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.357 [2024-07-14 02:55:41.364288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.357 [2024-07-14 02:55:41.364302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.357 [2024-07-14 02:55:41.364354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.357 [2024-07-14 02:55:41.364368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.358 [2024-07-14 02:55:41.364420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.358 [2024-07-14 02:55:41.364433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.615 NEW_FUNC[1/669]: 0x49ccb0 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:46.615 NEW_FUNC[2/669]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.615 #3 NEW cov: 11460 ft: 11461 corp: 2/10b lim: 10 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:46.615 [2024-07-14 02:55:41.664862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.664901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.664956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.664975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.665030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.665047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.665100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.665117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.615 #4 NEW cov: 11573 ft: 11930 corp: 3/19b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CopyPart- 00:07:46.615 [2024-07-14 02:55:41.714874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.714900] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.714951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.714964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.715010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006000 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.715024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.715072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000960 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.715084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.615 #5 NEW cov: 11579 ft: 12098 corp: 4/28b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:46.615 [2024-07-14 02:55:41.755010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.755037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.755085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.755098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.755147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.755161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.755208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.755221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.615 #6 NEW cov: 11664 ft: 12517 corp: 5/37b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:46.615 [2024-07-14 02:55:41.795185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.795210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.795259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.795272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.795320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.795334] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.795383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.795395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.795446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.795460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.615 #7 NEW cov: 11664 ft: 12624 corp: 6/47b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CrossOver- 00:07:46.615 [2024-07-14 02:55:41.835199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.835224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.835266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.835279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.835326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.835340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.615 [2024-07-14 02:55:41.835384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.615 [2024-07-14 02:55:41.835396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.615 #8 NEW cov: 11664 ft: 12688 corp: 7/56b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:46.873 [2024-07-14 02:55:41.874958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000609f cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.874983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 #11 NEW cov: 11664 ft: 13067 corp: 8/58b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 3 ChangeByte-CopyPart-InsertByte- 00:07:46.873 [2024-07-14 02:55:41.915419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.915451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 [2024-07-14 02:55:41.915500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.915514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.873 [2024-07-14 02:55:41.915561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.915574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.873 [2024-07-14 02:55:41.915622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.915634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.873 #12 NEW cov: 11664 ft: 13107 corp: 9/67b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 CopyPart- 00:07:46.873 [2024-07-14 02:55:41.955563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.955589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.873 [2024-07-14 02:55:41.955638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.955651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.873 [2024-07-14 02:55:41.955698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.955712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.873 [2024-07-14 02:55:41.955760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006009 cdw11:00000000 00:07:46.873 [2024-07-14 02:55:41.955771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.873 #13 NEW cov: 11664 ft: 13135 corp: 10/76b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:46.873 [2024-07-14 02:55:41.995786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:41.995812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:41.995861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:41.995875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:41.995907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:46.874 [2024-07-14 02:55:41.995924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:41.995969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006009 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:41.995982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:41.996027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE 
IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006021 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:41.996041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.874 #14 NEW cov: 11664 ft: 13173 corp: 11/86b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertByte- 00:07:46.874 [2024-07-14 02:55:42.035746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.035771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.035822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.035835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.035884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.035897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.035944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fd60 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.035957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.874 #15 NEW cov: 11664 ft: 13202 corp: 12/95b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ChangeByte- 00:07:46.874 [2024-07-14 02:55:42.076013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.076039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.076087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.076100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.076148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.076162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.076208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.076221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.076269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.076283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.874 #16 NEW cov: 11664 ft: 13210 corp: 13/105b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 
00:07:46.874 [2024-07-14 02:55:42.115971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.115997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.116056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.116070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.116118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.116132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.874 [2024-07-14 02:55:42.116179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:46.874 [2024-07-14 02:55:42.116192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.133 #17 NEW cov: 11664 ft: 13258 corp: 14/114b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 CopyPart- 00:07:47.133 [2024-07-14 02:55:42.156237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.156262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.156311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.156324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.156371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.156385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.156432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007b09 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.156450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.156500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006021 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.156513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.133 #18 NEW cov: 11664 ft: 13281 corp: 15/124b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:47.133 [2024-07-14 02:55:42.196243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.196269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:47.133 [2024-07-14 02:55:42.196317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.196330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.196378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.196392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.196439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fd60 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.196456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.133 #19 NEW cov: 11664 ft: 13321 corp: 16/133b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 CMP- DE: "\377\377\377\000"- 00:07:47.133 [2024-07-14 02:55:42.236068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.236097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.133 #20 NEW cov: 11687 ft: 13380 corp: 17/136b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 CrossOver- 00:07:47.133 [2024-07-14 02:55:42.276553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.276578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.276626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.276639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.276686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.276699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.276744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006f00 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.276757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.276801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.276815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.133 #21 NEW cov: 11687 ft: 13405 corp: 18/146b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:07:47.133 [2024-07-14 02:55:42.306654] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.306680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.306725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.306738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.306786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.306799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.306844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.306857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.306902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.306916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.133 #22 NEW cov: 11687 ft: 13493 corp: 19/156b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:07:47.133 [2024-07-14 02:55:42.336680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.336705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.336753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.336769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.336813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003f60 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.336827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.336873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.336885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.133 #23 NEW cov: 11687 ft: 13503 corp: 20/165b lim: 10 exec/s: 23 rss: 68Mb L: 9/10 MS: 1 ChangeByte- 00:07:47.133 [2024-07-14 02:55:42.366792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.366817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.366865] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.366878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.366926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003f2e cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.366940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.133 [2024-07-14 02:55:42.366988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.133 [2024-07-14 02:55:42.367001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.393 #24 NEW cov: 11687 ft: 13515 corp: 21/174b lim: 10 exec/s: 24 rss: 68Mb L: 9/10 MS: 1 ChangeByte- 00:07:47.393 [2024-07-14 02:55:42.406790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.406815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.406864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.406878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.406926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.406939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.393 #25 NEW cov: 11687 ft: 13685 corp: 22/181b lim: 10 exec/s: 25 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:07:47.393 [2024-07-14 02:55:42.447071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.447097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.447145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.447159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.447206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.447223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.447270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.447283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.447331] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000fd60 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.447344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.393 #26 NEW cov: 11687 ft: 13755 corp: 23/191b lim: 10 exec/s: 26 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:07:47.393 [2024-07-14 02:55:42.487096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.487120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.487169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.487182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.487231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005d60 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.487245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.487294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.487307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.393 #27 NEW cov: 11687 ft: 13781 corp: 24/200b lim: 10 exec/s: 27 rss: 68Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:47.393 [2024-07-14 02:55:42.517328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.517353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.393 [2024-07-14 02:55:42.517403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.393 [2024-07-14 02:55:42.517416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.517467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.517481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.517541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007b00 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.517553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.517600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006021 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.517613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.394 #28 NEW cov: 11687 ft: 13823 corp: 25/210b 
lim: 10 exec/s: 28 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:47.394 [2024-07-14 02:55:42.557423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.557452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.557503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.557516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.557565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.557580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.557627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007900 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.557641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.557689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006021 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.557703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.394 #29 NEW cov: 11687 ft: 13848 corp: 26/220b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:47.394 [2024-07-14 02:55:42.597340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a60 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.597365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.597414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.597427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.597479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005d60 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.597493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.597543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.597556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.394 #30 NEW cov: 11687 ft: 13855 corp: 27/229b lim: 10 exec/s: 30 rss: 69Mb L: 9/10 MS: 1 ChangeBit- 00:07:47.394 [2024-07-14 02:55:42.637537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.637563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.637613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.637627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.637674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000060fe cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.637687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.394 [2024-07-14 02:55:42.637738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.394 [2024-07-14 02:55:42.637751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 #31 NEW cov: 11687 ft: 13867 corp: 28/238b lim: 10 exec/s: 31 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:07:47.654 [2024-07-14 02:55:42.677739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.677765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.677814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.677827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.677875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.677889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.677936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.677948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.677998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.678011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.654 #32 NEW cov: 11687 ft: 13872 corp: 29/248b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:07:47.654 [2024-07-14 02:55:42.717655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.717681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.717728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006000 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.717741] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.717788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.717801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 #33 NEW cov: 11687 ft: 13878 corp: 30/255b lim: 10 exec/s: 33 rss: 69Mb L: 7/10 MS: 1 CrossOver- 00:07:47.654 [2024-07-14 02:55:42.757880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.757905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.757954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002160 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.757968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.758016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005d60 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.758029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.758077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.758091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 #34 NEW cov: 11687 ft: 13904 corp: 31/264b lim: 10 exec/s: 34 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:07:47.654 [2024-07-14 02:55:42.787945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.787973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.788022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.788035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.788083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.788097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.788143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.788155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 #35 NEW cov: 11687 ft: 13941 corp: 32/272b lim: 10 exec/s: 35 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:07:47.654 [2024-07-14 02:55:42.828119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00000a60 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.828144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.828193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.828206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.828255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.828269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.828316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.828329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 #36 NEW cov: 11687 ft: 13944 corp: 33/281b lim: 10 exec/s: 36 rss: 69Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:47.654 [2024-07-14 02:55:42.858297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.858321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.858370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.858383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.858432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.858451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.858498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.858511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.858559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.858572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.654 #37 NEW cov: 11687 ft: 13955 corp: 34/291b lim: 10 exec/s: 37 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:47.654 [2024-07-14 02:55:42.898423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.898454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.898504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 
nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.898517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.898565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.898579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.898626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.898639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.654 [2024-07-14 02:55:42.898687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.654 [2024-07-14 02:55:42.898701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.914 #38 NEW cov: 11687 ft: 13963 corp: 35/301b lim: 10 exec/s: 38 rss: 69Mb L: 10/10 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:07:47.914 [2024-07-14 02:55:42.938408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.938434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:42.938489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.938503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:42.938552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003f2a cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.938566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:42.938614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.938626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.914 #39 NEW cov: 11687 ft: 13985 corp: 36/310b lim: 10 exec/s: 39 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:47.914 [2024-07-14 02:55:42.978556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.978581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:42.978632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.978645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:42.978693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE 
IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006064 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.978706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:42.978753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:42.978769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.914 #40 NEW cov: 11687 ft: 13988 corp: 37/319b lim: 10 exec/s: 40 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:47.914 [2024-07-14 02:55:43.008637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.008662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.008695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002160 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.008708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.008754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005d60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.008768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.008811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.008825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.914 #41 NEW cov: 11687 ft: 14048 corp: 38/328b lim: 10 exec/s: 41 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:47.914 [2024-07-14 02:55:43.048743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.048768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.048813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002160 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.048825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.048871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005d60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.048885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.048929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.048942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.914 #42 NEW cov: 11687 ft: 14051 corp: 39/337b lim: 10 exec/s: 42 rss: 70Mb L: 9/10 MS: 1 CMP- DE: 
"\000\014"- 00:07:47.914 [2024-07-14 02:55:43.078610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.078635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.078682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.078695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.914 #43 NEW cov: 11687 ft: 14192 corp: 40/342b lim: 10 exec/s: 43 rss: 70Mb L: 5/10 MS: 1 EraseBytes- 00:07:47.914 [2024-07-14 02:55:43.119045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000600a cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.119070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.914 [2024-07-14 02:55:43.119117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.914 [2024-07-14 02:55:43.119133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.915 [2024-07-14 02:55:43.119179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000093f cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.119193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.915 [2024-07-14 02:55:43.119239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002a00 cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.119252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.915 [2024-07-14 02:55:43.119297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.119311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.915 #44 NEW cov: 11687 ft: 14208 corp: 41/352b lim: 10 exec/s: 44 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:47.915 [2024-07-14 02:55:43.159077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.159101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.915 [2024-07-14 02:55:43.159150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.159163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.915 [2024-07-14 02:55:43.159208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003f60 cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.159221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:47.915 [2024-07-14 02:55:43.159267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:47.915 [2024-07-14 02:55:43.159279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.174 #45 NEW cov: 11687 ft: 14217 corp: 42/361b lim: 10 exec/s: 45 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:48.174 [2024-07-14 02:55:43.199216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:48.174 [2024-07-14 02:55:43.199242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.174 [2024-07-14 02:55:43.199288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:48.174 [2024-07-14 02:55:43.199302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.199351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006064 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.199364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.199410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000960 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.199423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.175 #46 NEW cov: 11687 ft: 14230 corp: 43/370b lim: 10 exec/s: 46 rss: 70Mb L: 9/10 MS: 1 CopyPart- 00:07:48.175 [2024-07-14 02:55:43.239398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.239426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.239479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006009 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.239493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.239541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.239554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.239602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.239615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.239663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000603b cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.239676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 
m:0 dnr:0 00:07:48.175 #47 NEW cov: 11687 ft: 14240 corp: 44/380b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:07:48.175 [2024-07-14 02:55:43.269377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.269402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.269456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000060ff cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.269470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.269517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.269531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.269578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.269591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.175 #48 NEW cov: 11687 ft: 14263 corp: 45/389b lim: 10 exec/s: 48 rss: 70Mb L: 9/10 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:07:48.175 [2024-07-14 02:55:43.299359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006060 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.299383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.299429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000600a cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.299446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.299507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000600a cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.299521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.175 #49 NEW cov: 11687 ft: 14269 corp: 46/396b lim: 10 exec/s: 49 rss: 70Mb L: 7/10 MS: 1 CopyPart- 00:07:48.175 [2024-07-14 02:55:43.339677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.339702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.339752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:48.175 [2024-07-14 02:55:43.339765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.175 [2024-07-14 02:55:43.339811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:48.175 
[2024-07-14 02:55:43.339824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.175 [2024-07-14 02:55:43.339870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000
00:07:48.175 [2024-07-14 02:55:43.339883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:48.175 [2024-07-14 02:55:43.339928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000
00:07:48.175 [2024-07-14 02:55:43.339942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:48.175 #50 NEW cov: 11687 ft: 14287 corp: 47/406b lim: 10 exec/s: 25 rss: 70Mb L: 10/10 MS: 1 CopyPart-
00:07:48.175 #50 DONE cov: 11687 ft: 14287 corp: 47/406b lim: 10 exec/s: 25 rss: 70Mb
00:07:48.175 ###### Recommended dictionary. ######
00:07:48.175 "\377\377\377\000" # Uses: 3
00:07:48.175 "\000\014" # Uses: 0
00:07:48.175 ###### End of recommended dictionary. ######
00:07:48.175 Done 50 runs in 2 second(s)
00:07:48.435 02:55:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf
00:07:48.435 02:55:43 -- ../common.sh@72 -- # (( i++ ))
00:07:48.435 02:55:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:48.435 02:55:43 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:07:48.435 02:55:43 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:07:48.435 02:55:43 -- nvmf/run.sh@24 -- # local timen=1
00:07:48.435 02:55:43 -- nvmf/run.sh@25 -- # local core=0x1
00:07:48.435 02:55:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:48.435 02:55:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:07:48.435 02:55:43 -- nvmf/run.sh@29 -- # printf %02d 8
00:07:48.435 02:55:43 -- nvmf/run.sh@29 -- # port=4408
00:07:48.435 02:55:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:48.435 02:55:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:07:48.435 02:55:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:48.435 02:55:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock
00:07:48.435 [2024-07-14 02:55:43.512579] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:07:48.435 [2024-07-14 02:55:43.512657] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid670033 ] 00:07:48.435 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.695 [2024-07-14 02:55:43.765707] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.695 [2024-07-14 02:55:43.791300] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.695 [2024-07-14 02:55:43.791418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.695 [2024-07-14 02:55:43.842795] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.695 [2024-07-14 02:55:43.859085] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:48.695 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.695 INFO: Seed: 1057175702 00:07:48.695 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:48.695 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:48.695 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:48.695 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.695 [2024-07-14 02:55:43.925941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.695 [2024-07-14 02:55:43.925981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.695 #2 INITED cov: 11476 ft: 11488 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:48.955 [2024-07-14 02:55:43.976365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:43.976393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:43.976463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:43.976479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.955 #3 NEW cov: 11601 ft: 12643 corp: 2/3b lim: 5 exec/s: 0 rss: 65Mb L: 2/2 MS: 1 CopyPart- 00:07:48.955 [2024-07-14 02:55:44.036495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.036524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:44.036594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.036608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.955 #4 NEW cov: 11607 ft: 12950 corp: 3/5b lim: 5 exec/s: 0 rss: 
65Mb L: 2/2 MS: 1 ChangeBinInt- 00:07:48.955 [2024-07-14 02:55:44.087490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.087517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:44.087615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.087631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:44.087703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.087717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:44.087784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.087798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.955 #5 NEW cov: 11692 ft: 13543 corp: 4/9b lim: 5 exec/s: 0 rss: 65Mb L: 4/4 MS: 1 CopyPart- 00:07:48.955 [2024-07-14 02:55:44.146925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.146955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:44.147041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.147057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.955 #6 NEW cov: 11692 ft: 13638 corp: 5/11b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 ChangeByte- 00:07:48.955 [2024-07-14 02:55:44.197070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.197098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.955 [2024-07-14 02:55:44.197173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.955 [2024-07-14 02:55:44.197188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.215 #7 NEW cov: 11692 ft: 13754 corp: 6/13b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 CopyPart- 00:07:49.215 [2024-07-14 02:55:44.257593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.257621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.215 [2024-07-14 02:55:44.257709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.257725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.215 [2024-07-14 02:55:44.257799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.257813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.215 #8 NEW cov: 11692 ft: 13939 corp: 7/16b lim: 5 exec/s: 0 rss: 66Mb L: 3/4 MS: 1 CopyPart- 00:07:49.215 [2024-07-14 02:55:44.317420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.317451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.215 [2024-07-14 02:55:44.317556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.317572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.215 #9 NEW cov: 11692 ft: 14027 corp: 8/18b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 ChangeByte- 00:07:49.215 [2024-07-14 02:55:44.367423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.367455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.215 [2024-07-14 02:55:44.367522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.367537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.215 #10 NEW cov: 11692 ft: 14101 corp: 9/20b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 ChangeByte- 00:07:49.215 [2024-07-14 02:55:44.417768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.417795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.215 [2024-07-14 02:55:44.417865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.215 [2024-07-14 02:55:44.417882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.215 #11 NEW 
cov: 11692 ft: 14130 corp: 10/22b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:07:49.475 [2024-07-14 02:55:44.468550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.468580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.475 [2024-07-14 02:55:44.468674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.468690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.475 [2024-07-14 02:55:44.468764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.468790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.475 [2024-07-14 02:55:44.468863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.468877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.475 #12 NEW cov: 11692 ft: 14193 corp: 11/26b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertByte- 00:07:49.475 [2024-07-14 02:55:44.527763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.527791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.475 #13 NEW cov: 11692 ft: 14272 corp: 12/27b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 EraseBytes- 00:07:49.475 [2024-07-14 02:55:44.578303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.578330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.475 [2024-07-14 02:55:44.578400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.578415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.475 #14 NEW cov: 11692 ft: 14303 corp: 13/29b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 CopyPart- 00:07:49.475 [2024-07-14 02:55:44.628464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.628491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.475 [2024-07-14 02:55:44.628565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.628581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.475 #15 NEW cov: 11692 ft: 14380 corp: 14/31b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:07:49.475 [2024-07-14 02:55:44.678321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.475 [2024-07-14 02:55:44.678348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.475 #16 NEW cov: 11692 ft: 14466 corp: 15/32b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 ChangeByte- 00:07:49.735 [2024-07-14 02:55:44.729507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.729535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.729628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.729644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.729714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.729729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.729801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.729816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.735 #17 NEW cov: 11692 ft: 14509 corp: 16/36b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 CopyPart- 00:07:49.735 [2024-07-14 02:55:44.789954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.789983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.790055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.790071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.790138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.790153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.790227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.790243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.735 [2024-07-14 02:55:44.790315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.735 [2024-07-14 02:55:44.790330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.995 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.995 #18 NEW cov: 11715 ft: 14593 corp: 17/41b lim: 5 exec/s: 18 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:49.995 [2024-07-14 02:55:45.119681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.119727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.119857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.119878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.120012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.120031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.120159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.120180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.995 #19 NEW cov: 11715 ft: 14757 corp: 18/45b lim: 5 exec/s: 19 rss: 68Mb L: 4/5 MS: 1 ChangeByte- 00:07:49.995 [2024-07-14 02:55:45.169110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.169140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.169236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.169254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.995 #20 NEW cov: 11715 ft: 14825 corp: 19/47b lim: 5 exec/s: 20 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:49.995 [2024-07-14 
02:55:45.209990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.210020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.210146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.210162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.210283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.210298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.210428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.210448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.995 [2024-07-14 02:55:45.210568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.995 [2024-07-14 02:55:45.210587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.995 #21 NEW cov: 11715 ft: 14844 corp: 20/52b lim: 5 exec/s: 21 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:50.254 [2024-07-14 02:55:45.259141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.259171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.254 #22 NEW cov: 11715 ft: 14888 corp: 21/53b lim: 5 exec/s: 22 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:07:50.254 [2024-07-14 02:55:45.300029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.300058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.254 [2024-07-14 02:55:45.300175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.300192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.254 [2024-07-14 02:55:45.300308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.300324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.254 [2024-07-14 02:55:45.300448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.300464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.254 #23 NEW cov: 11715 ft: 14918 corp: 22/57b lim: 5 exec/s: 23 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:07:50.254 [2024-07-14 02:55:45.339677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.339706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.254 [2024-07-14 02:55:45.339821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.339838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.254 #24 NEW cov: 11715 ft: 14930 corp: 23/59b lim: 5 exec/s: 24 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:50.254 [2024-07-14 02:55:45.379476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.254 [2024-07-14 02:55:45.379504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.254 #25 NEW cov: 11715 ft: 14945 corp: 24/60b lim: 5 exec/s: 25 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:07:50.255 [2024-07-14 02:55:45.420121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.420150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.255 [2024-07-14 02:55:45.420273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.420290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.255 [2024-07-14 02:55:45.420416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.420434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.255 #26 NEW cov: 11715 ft: 14960 corp: 25/63b lim: 5 exec/s: 26 rss: 68Mb L: 3/5 MS: 1 CrossOver- 00:07:50.255 [2024-07-14 02:55:45.460059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.460087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.255 [2024-07-14 02:55:45.460203] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.460221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.255 #27 NEW cov: 11715 ft: 14976 corp: 26/65b lim: 5 exec/s: 27 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:50.255 [2024-07-14 02:55:45.500124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.500153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.255 [2024-07-14 02:55:45.500280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.255 [2024-07-14 02:55:45.500298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.513 #28 NEW cov: 11715 ft: 14989 corp: 27/67b lim: 5 exec/s: 28 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:50.513 [2024-07-14 02:55:45.550004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.550032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.513 #29 NEW cov: 11715 ft: 15010 corp: 28/68b lim: 5 exec/s: 29 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:50.513 [2024-07-14 02:55:45.590909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.590939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.513 [2024-07-14 02:55:45.591064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.591080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.513 [2024-07-14 02:55:45.591201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.591218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.513 [2024-07-14 02:55:45.591346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.591363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.513 #30 NEW cov: 11715 ft: 15076 corp: 29/72b lim: 5 exec/s: 30 rss: 68Mb L: 4/5 MS: 1 CopyPart- 00:07:50.513 [2024-07-14 02:55:45.640646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.640674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.513 [2024-07-14 02:55:45.640795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.513 [2024-07-14 02:55:45.640814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.513 #31 NEW cov: 11715 ft: 15082 corp: 30/74b lim: 5 exec/s: 31 rss: 68Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:50.513 [2024-07-14 02:55:45.680424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.680456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.514 #32 NEW cov: 11715 ft: 15095 corp: 31/75b lim: 5 exec/s: 32 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:07:50.514 [2024-07-14 02:55:45.721325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.721355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.514 [2024-07-14 02:55:45.721470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.721488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.514 [2024-07-14 02:55:45.721609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.721626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.514 [2024-07-14 02:55:45.721744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.721760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.514 #33 NEW cov: 11715 ft: 15105 corp: 32/79b lim: 5 exec/s: 33 rss: 68Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:50.514 [2024-07-14 02:55:45.760976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.761004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.514 [2024-07-14 02:55:45.761138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.514 [2024-07-14 02:55:45.761153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.772 #34 NEW cov: 11715 ft: 15109 corp: 33/81b lim: 5 exec/s: 34 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:50.772 [2024-07-14 02:55:45.801373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.772 [2024-07-14 02:55:45.801400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.772 [2024-07-14 02:55:45.801522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.801541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.801663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.801678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.773 #35 NEW cov: 11715 ft: 15113 corp: 34/84b lim: 5 exec/s: 35 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:50.773 [2024-07-14 02:55:45.841903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.841930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.842053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.842070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.842158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.842174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.842301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.842316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.842437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.842457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.773 #36 NEW cov: 11715 ft: 15133 corp: 35/89b lim: 5 exec/s: 36 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:50.773 [2024-07-14 02:55:45.891888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.891914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.892029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.892062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.892170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.892186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.773 [2024-07-14 02:55:45.892306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.773 [2024-07-14 02:55:45.892321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.773 #37 NEW cov: 11715 ft: 15157 corp: 36/93b lim: 5 exec/s: 18 rss: 69Mb L: 4/5 MS: 1 CopyPart- 00:07:50.773 #37 DONE cov: 11715 ft: 15157 corp: 36/93b lim: 5 exec/s: 18 rss: 69Mb 00:07:50.773 Done 37 runs in 2 second(s) 00:07:50.773 02:55:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:51.032 02:55:46 -- ../common.sh@72 -- # (( i++ )) 00:07:51.032 02:55:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.032 02:55:46 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:51.032 02:55:46 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:51.032 02:55:46 -- nvmf/run.sh@24 -- # local timen=1 00:07:51.032 02:55:46 -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.032 02:55:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:51.032 02:55:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:51.032 02:55:46 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:51.032 02:55:46 -- nvmf/run.sh@29 -- # port=4409 00:07:51.032 02:55:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:51.032 02:55:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:51.032 02:55:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.032 02:55:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:51.032 [2024-07-14 02:55:46.067366] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:51.032 [2024-07-14 02:55:46.067466] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid670570 ] 00:07:51.032 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.032 [2024-07-14 02:55:46.243048] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.032 [2024-07-14 02:55:46.262418] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:51.032 [2024-07-14 02:55:46.262541] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.291 [2024-07-14 02:55:46.314047] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.291 [2024-07-14 02:55:46.330315] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:51.291 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.291 INFO: Seed: 3529175940 00:07:51.291 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:51.291 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:51.291 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:51.291 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.291 [2024-07-14 02:55:46.374989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.291 [2024-07-14 02:55:46.375022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.291 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 64Mb 00:07:51.291 [2024-07-14 02:55:46.424984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.291 [2024-07-14 02:55:46.425015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.291 #3 NEW cov: 11601 ft: 12179 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeBinInt- 00:07:51.291 [2024-07-14 02:55:46.485324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.291 [2024-07-14 02:55:46.485357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.291 [2024-07-14 02:55:46.485391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.291 [2024-07-14 02:55:46.485411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.291 [2024-07-14 02:55:46.485440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.291 [2024-07-14 02:55:46.485463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.291 [2024-07-14 02:55:46.485492] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.291 [2024-07-14 02:55:46.485508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.291 #4 NEW cov: 11607 ft: 13136 corp: 3/6b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:51.552 [2024-07-14 02:55:46.545560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.545593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.545626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.545643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.545672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.545689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.545717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.545733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.545761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.545776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.552 #5 NEW cov: 11692 ft: 13553 corp: 4/11b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:07:51.552 [2024-07-14 02:55:46.605709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.605741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.605783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.605798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.605826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.605842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.605869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.605887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.605914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.605928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.552 #6 NEW cov: 11692 ft: 13630 corp: 5/16b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:07:51.552 [2024-07-14 02:55:46.655624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.655656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.655687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.655702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.552 #7 NEW cov: 11692 ft: 13876 corp: 6/18b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 InsertByte- 00:07:51.552 [2024-07-14 02:55:46.705887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.705918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.705951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.705966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.705993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.706008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.706035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.706049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.706076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.706091] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.552 #8 NEW cov: 11692 ft: 13968 corp: 7/23b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 CopyPart- 00:07:51.552 [2024-07-14 02:55:46.766067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.766100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.766133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.766149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.766182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.766198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.552 [2024-07-14 02:55:46.766226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.552 [2024-07-14 02:55:46.766242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.812 #9 NEW cov: 11692 ft: 13985 corp: 8/27b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 EraseBytes- 00:07:51.812 [2024-07-14 02:55:46.826222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.826254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.826287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.826303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.826332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.826349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.826377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.826393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.812 #10 NEW cov: 11692 ft: 14079 corp: 9/31b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 EraseBytes- 00:07:51.812 [2024-07-14 02:55:46.876134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.876165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.812 #11 NEW cov: 11692 ft: 14130 corp: 10/32b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:51.812 [2024-07-14 02:55:46.926273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.926304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.812 #12 NEW cov: 11692 ft: 14179 corp: 11/33b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:51.812 [2024-07-14 02:55:46.986688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.986719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.986751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.986766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.986793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.986812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.986840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.812 [2024-07-14 02:55:46.986854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.812 [2024-07-14 02:55:46.986881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.813 [2024-07-14 02:55:46.986896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.813 #13 NEW cov: 11692 ft: 14205 corp: 12/38b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:51.813 [2024-07-14 02:55:47.036789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.813 [2024-07-14 02:55:47.036820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.813 [2024-07-14 02:55:47.036852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.813 [2024-07-14 02:55:47.036867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.813 [2024-07-14 02:55:47.036894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.813 [2024-07-14 02:55:47.036909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.813 [2024-07-14 02:55:47.036936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.813 [2024-07-14 02:55:47.036950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.813 [2024-07-14 02:55:47.036977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.813 [2024-07-14 02:55:47.036991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.072 #14 NEW cov: 11692 ft: 14221 corp: 13/43b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:52.072 [2024-07-14 02:55:47.096921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.096952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.096983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.096998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.097026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.097041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.097067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.097082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.072 #15 NEW cov: 11692 ft: 14253 corp: 14/47b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeBit- 00:07:52.072 [2024-07-14 02:55:47.147041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.147073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.147104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.147120] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.147147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.147163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.147190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.147204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.072 #16 NEW cov: 11692 ft: 14262 corp: 15/51b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeBit- 00:07:52.072 [2024-07-14 02:55:47.197127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.197157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.197189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.197204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.197231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.197245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.072 #17 NEW cov: 11692 ft: 14424 corp: 16/54b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 EraseBytes- 00:07:52.072 [2024-07-14 02:55:47.247255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.247286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.247317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.247332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.072 [2024-07-14 02:55:47.247359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.072 [2024-07-14 02:55:47.247374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.332 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.332 #18 NEW cov: 11715 ft: 14470 corp: 17/57b lim: 5 exec/s: 18 rss: 
68Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:52.332 [2024-07-14 02:55:47.578122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.332 [2024-07-14 02:55:47.578160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 #19 NEW cov: 11715 ft: 14571 corp: 18/58b lim: 5 exec/s: 19 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:52.593 [2024-07-14 02:55:47.638242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.638272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.638304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.638318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.593 #20 NEW cov: 11715 ft: 14575 corp: 19/60b lim: 5 exec/s: 20 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:52.593 [2024-07-14 02:55:47.698517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.698548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.698578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.698594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.698621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.698636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.698663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.698678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.593 #21 NEW cov: 11715 ft: 14591 corp: 20/64b lim: 5 exec/s: 21 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:07:52.593 [2024-07-14 02:55:47.748501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.748533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.748564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.748579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.593 #22 NEW cov: 11715 ft: 14605 corp: 21/66b lim: 5 exec/s: 22 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:52.593 [2024-07-14 02:55:47.798850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.798880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.798911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.798930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.798958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.798973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.799000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.799015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.593 [2024-07-14 02:55:47.799041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.593 [2024-07-14 02:55:47.799055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.854 #23 NEW cov: 11715 ft: 14622 corp: 22/71b lim: 5 exec/s: 23 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:07:52.854 [2024-07-14 02:55:47.858767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.858797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.854 #24 NEW cov: 11715 ft: 14630 corp: 23/72b lim: 5 exec/s: 24 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:52.854 [2024-07-14 02:55:47.919155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.919186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:47.919217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.919232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:52.854 [2024-07-14 02:55:47.919260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.919275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:47.919302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.919317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:47.919344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.919358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.854 #25 NEW cov: 11715 ft: 14640 corp: 24/77b lim: 5 exec/s: 25 rss: 69Mb L: 5/5 MS: 1 InsertByte- 00:07:52.854 [2024-07-14 02:55:47.979131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.979160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:47.979191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:47.979210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.854 #26 NEW cov: 11715 ft: 14665 corp: 25/79b lim: 5 exec/s: 26 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:52.854 [2024-07-14 02:55:48.039456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.039488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.039519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.039535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.039562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.039578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.039606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.039620] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.039647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.039661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.854 #27 NEW cov: 11715 ft: 14667 corp: 26/84b lim: 5 exec/s: 27 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:07:52.854 [2024-07-14 02:55:48.089607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.089638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.089669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.089683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.089711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.089726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.089753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.089768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.854 [2024-07-14 02:55:48.089794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.854 [2024-07-14 02:55:48.089809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.114 #28 NEW cov: 11715 ft: 14691 corp: 27/89b lim: 5 exec/s: 28 rss: 69Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:53.114 [2024-07-14 02:55:48.159631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.159663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.159694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.159709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.114 #29 NEW cov: 11715 ft: 14710 corp: 28/91b lim: 5 exec/s: 29 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:53.114 [2024-07-14 02:55:48.209759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) 
qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.209789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.209820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.209835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.114 #30 NEW cov: 11715 ft: 14732 corp: 29/93b lim: 5 exec/s: 30 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:53.114 [2024-07-14 02:55:48.260059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.260089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.260120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.260135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.260162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.260177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.260204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.260219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.260245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.260260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.114 #31 NEW cov: 11715 ft: 14737 corp: 30/98b lim: 5 exec/s: 31 rss: 69Mb L: 5/5 MS: 1 CrossOver- 00:07:53.114 [2024-07-14 02:55:48.320032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.320063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.114 [2024-07-14 02:55:48.320094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.114 [2024-07-14 02:55:48.320109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.422 #32 pulse cov: 11715 ft: 14781 corp: 30/98b lim: 5 
exec/s: 16 rss: 70Mb 00:07:53.422 #32 NEW cov: 11715 ft: 14781 corp: 31/100b lim: 5 exec/s: 16 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:53.422 #32 DONE cov: 11715 ft: 14781 corp: 31/100b lim: 5 exec/s: 16 rss: 70Mb 00:07:53.422 Done 32 runs in 2 second(s) 00:07:53.422 02:55:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:53.422 02:55:48 -- ../common.sh@72 -- # (( i++ )) 00:07:53.422 02:55:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.422 02:55:48 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:53.422 02:55:48 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:53.422 02:55:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:53.422 02:55:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.422 02:55:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:53.423 02:55:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:53.423 02:55:48 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:53.423 02:55:48 -- nvmf/run.sh@29 -- # port=4410 00:07:53.423 02:55:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:53.423 02:55:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:53.423 02:55:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.423 02:55:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:53.423 [2024-07-14 02:55:48.518791] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:53.423 [2024-07-14 02:55:48.518895] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid671102 ] 00:07:53.423 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.687 [2024-07-14 02:55:48.701036] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.687 [2024-07-14 02:55:48.720450] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.687 [2024-07-14 02:55:48.720565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.687 [2024-07-14 02:55:48.771881] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.687 [2024-07-14 02:55:48.788181] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:53.687 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:53.687 INFO: Seed: 1692207841 00:07:53.687 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:53.687 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:53.687 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:53.687 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.687 #2 INITED exec/s: 0 rss: 59Mb 00:07:53.687 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.687 This may also happen if the target rejected all inputs we tried so far 00:07:53.687 [2024-07-14 02:55:48.833406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.687 [2024-07-14 02:55:48.833434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.687 [2024-07-14 02:55:48.833498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.687 [2024-07-14 02:55:48.833512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.947 NEW_FUNC[1/670]: 0x49e620 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:53.947 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.947 #12 NEW cov: 11510 ft: 11512 corp: 2/22b lim: 40 exec/s: 0 rss: 66Mb L: 21/21 MS: 5 ShuffleBytes-CrossOver-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:53.947 [2024-07-14 02:55:49.144265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.947 [2024-07-14 02:55:49.144297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.947 [2024-07-14 02:55:49.144354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.947 [2024-07-14 02:55:49.144369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.947 #13 NEW cov: 11624 ft: 12222 corp: 3/43b lim: 40 exec/s: 0 rss: 66Mb L: 21/21 MS: 1 ChangeBit- 00:07:53.947 [2024-07-14 02:55:49.184421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.947 [2024-07-14 02:55:49.184454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.947 [2024-07-14 02:55:49.184515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.947 [2024-07-14 02:55:49.184530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.947 [2024-07-14 02:55:49.184589] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8767676 cdw11:76767227 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.947 [2024-07-14 02:55:49.184604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.206 #14 NEW cov: 11630 ft: 12605 corp: 4/68b lim: 40 exec/s: 0 rss: 66Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:54.206 [2024-07-14 02:55:49.224364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767674 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.224391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.224456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.224471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.206 #15 NEW cov: 11715 ft: 12897 corp: 5/89b lim: 40 exec/s: 0 rss: 66Mb L: 21/25 MS: 1 ChangeBit- 00:07:54.206 [2024-07-14 02:55:49.264631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.264658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.264719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7cb8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.264733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.264790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8767676 cdw11:76767227 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.264807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.206 #16 NEW cov: 11715 ft: 12991 corp: 6/114b lim: 40 exec/s: 0 rss: 66Mb L: 25/25 MS: 1 ChangeBinInt- 00:07:54.206 [2024-07-14 02:55:49.304705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.304731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.304791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.304805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.304861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8b87676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.304876] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.206 #17 NEW cov: 11715 ft: 13135 corp: 7/140b lim: 40 exec/s: 0 rss: 66Mb L: 26/26 MS: 1 CrossOver- 00:07:54.206 [2024-07-14 02:55:49.344980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.345005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.345065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.345080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.345138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000076 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.345152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.206 [2024-07-14 02:55:49.345212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.206 [2024-07-14 02:55:49.345226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.206 #18 NEW cov: 11715 ft: 13625 corp: 8/179b lim: 40 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:54.207 [2024-07-14 02:55:49.384851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.384877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.207 [2024-07-14 02:55:49.384938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.384952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.207 #19 NEW cov: 11715 ft: 13660 corp: 9/200b lim: 40 exec/s: 0 rss: 67Mb L: 21/39 MS: 1 CrossOver- 00:07:54.207 [2024-07-14 02:55:49.414900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767674 cdw11:760a7676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.414926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.207 [2024-07-14 02:55:49.414990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.415004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.207 #20 NEW cov: 11715 ft: 13745 corp: 10/219b lim: 40 exec/s: 0 rss: 67Mb L: 
19/39 MS: 1 CrossOver- 00:07:54.207 [2024-07-14 02:55:49.455363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.455390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.207 [2024-07-14 02:55:49.455452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.455467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.207 [2024-07-14 02:55:49.455536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000076 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.455550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.207 [2024-07-14 02:55:49.455606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.207 [2024-07-14 02:55:49.455620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.466 #26 NEW cov: 11715 ft: 13793 corp: 11/258b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBit- 00:07:54.466 [2024-07-14 02:55:49.495187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767674 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.495214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.466 [2024-07-14 02:55:49.495272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76237676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.495287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.466 #27 NEW cov: 11715 ft: 13861 corp: 12/279b lim: 40 exec/s: 0 rss: 67Mb L: 21/39 MS: 1 ChangeByte- 00:07:54.466 [2024-07-14 02:55:49.535291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767674 cdw11:760a7676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.535318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.466 [2024-07-14 02:55:49.535376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.535391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.466 #28 NEW cov: 11715 ft: 13913 corp: 13/298b lim: 40 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 ShuffleBytes- 00:07:54.466 [2024-07-14 02:55:49.575402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.575429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.466 [2024-07-14 02:55:49.575494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.575511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.466 #29 NEW cov: 11715 ft: 13951 corp: 14/320b lim: 40 exec/s: 0 rss: 67Mb L: 22/39 MS: 1 InsertByte- 00:07:54.466 [2024-07-14 02:55:49.615526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767674 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.615552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.466 [2024-07-14 02:55:49.615609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.615623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.466 #30 NEW cov: 11715 ft: 13983 corp: 15/341b lim: 40 exec/s: 0 rss: 67Mb L: 21/39 MS: 1 ShuffleBytes- 00:07:54.466 [2024-07-14 02:55:49.645742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.645768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.466 [2024-07-14 02:55:49.645824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.645838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.466 [2024-07-14 02:55:49.645897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8b87676 cdw11:8a877672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.645911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.466 #31 NEW cov: 11715 ft: 14038 corp: 16/367b lim: 40 exec/s: 0 rss: 67Mb L: 26/39 MS: 1 ChangeBinInt- 00:07:54.466 [2024-07-14 02:55:49.685609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.466 [2024-07-14 02:55:49.685635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.466 #36 NEW cov: 11715 ft: 14393 corp: 17/382b lim: 40 exec/s: 0 rss: 67Mb L: 15/39 MS: 5 ChangeByte-ChangeBit-CMP-EraseBytes-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:07:54.726 [2024-07-14 02:55:49.725989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:54.726 [2024-07-14 02:55:49.726017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.726077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.726091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.726147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8987676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.726161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.726 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.726 #37 NEW cov: 11738 ft: 14438 corp: 18/408b lim: 40 exec/s: 0 rss: 67Mb L: 26/39 MS: 1 ChangeBit- 00:07:54.726 [2024-07-14 02:55:49.766074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.766103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.766164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.766179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.766234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:76767227 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.766248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.726 #38 NEW cov: 11738 ft: 14517 corp: 19/433b lim: 40 exec/s: 0 rss: 67Mb L: 25/39 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:54.726 [2024-07-14 02:55:49.806065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.806092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.806154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b8b87676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.806168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.726 #39 NEW cov: 11738 ft: 14527 corp: 20/451b lim: 40 exec/s: 39 rss: 67Mb L: 18/39 MS: 1 EraseBytes- 00:07:54.726 [2024-07-14 02:55:49.846586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.846612] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.846673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.846687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.846744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000076 cdw11:7676762c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.846758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.846816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.846829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.846890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:7676270a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.846904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.726 #40 NEW cov: 11738 ft: 14574 corp: 21/491b lim: 40 exec/s: 40 rss: 67Mb L: 40/40 MS: 1 InsertByte- 00:07:54.726 [2024-07-14 02:55:49.886641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.886668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.886733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7cb8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.886748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.886807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.886821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.886881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:46464646 cdw11:464646b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.886895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.886952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:7672270a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.886967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.726 #41 NEW cov: 11738 ft: 14592 corp: 22/531b lim: 40 exec/s: 41 rss: 67Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:54.726 [2024-07-14 02:55:49.926339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.926365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.926425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.926440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.726 #42 NEW cov: 11738 ft: 14602 corp: 23/552b lim: 40 exec/s: 42 rss: 67Mb L: 21/40 MS: 1 ShuffleBytes- 00:07:54.726 [2024-07-14 02:55:49.956568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.956593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.726 [2024-07-14 02:55:49.956652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:0000ff82 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.726 [2024-07-14 02:55:49.956666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 #43 NEW cov: 11738 ft: 14606 corp: 24/568b lim: 40 exec/s: 43 rss: 67Mb L: 16/40 MS: 1 InsertByte- 00:07:54.986 [2024-07-14 02:55:49.996666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a567676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:49.996692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:49.996753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:49.996768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 #44 NEW cov: 11738 ft: 14613 corp: 25/589b lim: 40 exec/s: 44 rss: 67Mb L: 21/40 MS: 1 ChangeBit- 00:07:54.986 [2024-07-14 02:55:50.026752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:23767674 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.026782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.026845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76237676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.026860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 #45 NEW cov: 11738 ft: 
14624 corp: 26/610b lim: 40 exec/s: 45 rss: 68Mb L: 21/40 MS: 1 ChangeByte- 00:07:54.986 [2024-07-14 02:55:50.066870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.066900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.066960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.066975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 #46 NEW cov: 11738 ft: 14629 corp: 27/631b lim: 40 exec/s: 46 rss: 68Mb L: 21/40 MS: 1 ShuffleBytes- 00:07:54.986 [2024-07-14 02:55:50.097074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:41767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.097102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.097163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.097178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.097235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8b87676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.097248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.986 #47 NEW cov: 11738 ft: 14666 corp: 28/657b lim: 40 exec/s: 47 rss: 68Mb L: 26/40 MS: 1 InsertByte- 00:07:54.986 [2024-07-14 02:55:50.137214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76760a76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.137241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.137301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.137315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.137375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8767676 cdw11:76767227 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.137389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.986 #48 NEW cov: 11738 ft: 14676 corp: 29/682b lim: 40 exec/s: 48 rss: 68Mb L: 25/40 MS: 1 ShuffleBytes- 00:07:54.986 [2024-07-14 02:55:50.167160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 
cid:4 nsid:0 cdw10:0a767674 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.167187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.167250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.167265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 #49 NEW cov: 11738 ft: 14714 corp: 30/700b lim: 40 exec/s: 49 rss: 68Mb L: 18/40 MS: 1 EraseBytes- 00:07:54.986 [2024-07-14 02:55:50.197255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0afe7676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.197281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.986 [2024-07-14 02:55:50.197343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.197357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.986 #50 NEW cov: 11738 ft: 14724 corp: 31/722b lim: 40 exec/s: 50 rss: 68Mb L: 22/40 MS: 1 InsertByte- 00:07:54.986 [2024-07-14 02:55:50.237303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.986 [2024-07-14 02:55:50.237330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.245 #51 NEW cov: 11738 ft: 14798 corp: 32/737b lim: 40 exec/s: 51 rss: 68Mb L: 15/40 MS: 1 ChangeByte- 00:07:55.245 [2024-07-14 02:55:50.277928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.277954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.278016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.278031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.278089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000076 cdw11:7676762c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.278104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.278165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.278178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.278236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:7676000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.278261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.245 #52 NEW cov: 11738 ft: 14815 corp: 33/777b lim: 40 exec/s: 52 rss: 68Mb L: 40/40 MS: 1 ChangeByte- 00:07:55.245 [2024-07-14 02:55:50.317741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76760000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.317766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.317829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:001a7676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.317846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.317904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8987676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.317917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.245 #53 NEW cov: 11738 ft: 14829 corp: 34/803b lim: 40 exec/s: 53 rss: 68Mb L: 26/40 MS: 1 ChangeBinInt- 00:07:55.245 [2024-07-14 02:55:50.357760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a567676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.357786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.245 [2024-07-14 02:55:50.357847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:765b7676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.357861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.245 #54 NEW cov: 11738 ft: 14868 corp: 35/824b lim: 40 exec/s: 54 rss: 68Mb L: 21/40 MS: 1 ChangeByte- 00:07:55.245 [2024-07-14 02:55:50.397849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.245 [2024-07-14 02:55:50.397874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.246 [2024-07-14 02:55:50.397933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.397948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.246 #55 NEW cov: 11738 ft: 14884 corp: 36/846b lim: 40 exec/s: 55 rss: 68Mb L: 22/40 MS: 1 InsertByte- 00:07:55.246 [2024-07-14 02:55:50.438007] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.438032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.246 [2024-07-14 02:55:50.438091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:0000ff76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.438105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.246 #56 NEW cov: 11738 ft: 14902 corp: 37/862b lim: 40 exec/s: 56 rss: 68Mb L: 16/40 MS: 1 CrossOver- 00:07:55.246 [2024-07-14 02:55:50.478476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.478502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.246 [2024-07-14 02:55:50.478566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7cb8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.478580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.246 [2024-07-14 02:55:50.478637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.478651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.246 [2024-07-14 02:55:50.478710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:46464646 cdw11:464646b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.478733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.246 [2024-07-14 02:55:50.478794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:767227ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.246 [2024-07-14 02:55:50.478808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.505 #57 NEW cov: 11738 ft: 14909 corp: 38/902b lim: 40 exec/s: 57 rss: 68Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:55.505 [2024-07-14 02:55:50.518345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.518370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.518427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.518445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
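The paired *NOTICE* records above are SPDK's qpair tracing: nvme_admin_qpair_print_command() logs each admin command the fuzzer submits (here SECURITY RECEIVE, opcode 0x82, with fuzzed cdw10/cdw11 payloads), and spdk_nvme_print_completion() logs the INVALID OPCODE (00/01) status the target returns for it. A quick way to see which opcodes a saved console log exercised is to tally those lines; the helper below is a hypothetical post-processing sketch keyed to the exact format shown above, not anything shipped in the SPDK tree:

```bash
#!/usr/bin/env bash
# Hypothetical helper: count how often each admin opcode appears in a captured
# llvm_nvme_fuzz console log, using the *NOTICE* lines printed by
# nvme_admin_qpair_print_command (e.g. "SECURITY RECEIVE (82)").
log="${1:?usage: $0 <fuzzer-console-log>}"

grep -oE 'nvme_admin_qpair_print_command: \*NOTICE\*: [A-Z ]+\([0-9a-fA-F]+\)' "$log" |
  sed 's/.*\*NOTICE\*: //' |   # keep just "SECURITY RECEIVE (82)" etc.
  sort | uniq -c | sort -rn    # most-exercised opcode first
```

Against the run-10 portion above it would report a single opcode, SECURITY RECEIVE (82), since that fuzzer drives only security-receive commands.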
00:07:55.505 [2024-07-14 02:55:50.518501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8b80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.518514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.505 #58 NEW cov: 11738 ft: 14922 corp: 39/928b lim: 40 exec/s: 58 rss: 68Mb L: 26/40 MS: 1 ChangeBinInt- 00:07:55.505 [2024-07-14 02:55:50.558437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.558467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.558526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7cb8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.558540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.558597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8767676 cdw11:76767027 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.558610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.505 #59 NEW cov: 11738 ft: 14926 corp: 40/953b lim: 40 exec/s: 59 rss: 68Mb L: 25/40 MS: 1 ChangeBit- 00:07:55.505 [2024-07-14 02:55:50.598598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.598624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.598681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.598695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.598755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8b87676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.598768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.505 #60 NEW cov: 11738 ft: 14944 corp: 41/979b lim: 40 exec/s: 60 rss: 68Mb L: 26/40 MS: 1 ShuffleBytes- 00:07:55.505 [2024-07-14 02:55:50.628393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a760a00 cdw11:00767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.628419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.505 #61 NEW cov: 11738 ft: 14950 corp: 42/991b lim: 40 exec/s: 61 rss: 68Mb L: 12/40 MS: 1 CrossOver- 00:07:55.505 [2024-07-14 02:55:50.668762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a760a00 cdw11:00767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.668787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.668842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.668856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.668912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767227 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.668926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.505 #62 NEW cov: 11738 ft: 14956 corp: 43/1020b lim: 40 exec/s: 62 rss: 69Mb L: 29/40 MS: 1 CrossOver- 00:07:55.505 [2024-07-14 02:55:50.708882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.708908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.708966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.708980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.505 [2024-07-14 02:55:50.709037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:7cb8b8b8 cdw11:b8767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.505 [2024-07-14 02:55:50.709051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.505 #63 NEW cov: 11738 ft: 14964 corp: 44/1049b lim: 40 exec/s: 63 rss: 69Mb L: 29/40 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:55.505 [2024-07-14 02:55:50.748902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.506 [2024-07-14 02:55:50.748928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.506 [2024-07-14 02:55:50.748986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7676270a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.506 [2024-07-14 02:55:50.748999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.763 #64 NEW cov: 11738 ft: 14977 corp: 45/1065b lim: 40 exec/s: 64 rss: 69Mb L: 16/40 MS: 1 EraseBytes- 00:07:55.763 [2024-07-14 02:55:50.789365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.763 [2024-07-14 02:55:50.789391] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.763 [2024-07-14 02:55:50.789440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:7cb8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.763 [2024-07-14 02:55:50.789459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.763 [2024-07-14 02:55:50.789502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.763 [2024-07-14 02:55:50.789516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.763 [2024-07-14 02:55:50.789572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:46464646 cdw11:464646b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.764 [2024-07-14 02:55:50.789586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.764 [2024-07-14 02:55:50.789641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:767227ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.764 [2024-07-14 02:55:50.789655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.764 #65 NEW cov: 11738 ft: 14989 corp: 46/1105b lim: 40 exec/s: 65 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:55.764 [2024-07-14 02:55:50.829243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a767674 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.764 [2024-07-14 02:55:50.829270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.764 [2024-07-14 02:55:50.829327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a767676 cdw11:7676b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.764 [2024-07-14 02:55:50.829341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.764 [2024-07-14 02:55:50.829395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b8987676 cdw11:76767672 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.764 [2024-07-14 02:55:50.829409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.764 #66 NEW cov: 11738 ft: 15006 corp: 47/1131b lim: 40 exec/s: 33 rss: 69Mb L: 26/40 MS: 1 ChangeBit- 00:07:55.764 #66 DONE cov: 11738 ft: 15006 corp: 47/1131b lim: 40 exec/s: 33 rss: 69Mb 00:07:55.764 ###### Recommended dictionary. ###### 00:07:55.764 "\377\377\377\377" # Uses: 2 00:07:55.764 ###### End of recommended dictionary. 
###### 00:07:55.764 Done 66 runs in 2 second(s) 00:07:55.764 02:55:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:55.764 02:55:50 -- ../common.sh@72 -- # (( i++ )) 00:07:55.764 02:55:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.764 02:55:50 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:55.764 02:55:50 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:55.764 02:55:50 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.764 02:55:50 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.764 02:55:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:55.764 02:55:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:55.764 02:55:50 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:55.764 02:55:50 -- nvmf/run.sh@29 -- # port=4411 00:07:55.764 02:55:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:55.764 02:55:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:55.764 02:55:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.764 02:55:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:55.764 [2024-07-14 02:55:51.001516] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:55.764 [2024-07-14 02:55:51.001584] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid671411 ] 00:07:56.022 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.022 [2024-07-14 02:55:51.175547] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.022 [2024-07-14 02:55:51.195192] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.022 [2024-07-14 02:55:51.195307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.022 [2024-07-14 02:55:51.246649] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.022 [2024-07-14 02:55:51.262942] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:56.280 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.280 INFO: Seed: 4166218433 00:07:56.280 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:56.280 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:56.280 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:56.280 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.280 #2 INITED exec/s: 0 rss: 59Mb 00:07:56.280 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:56.280 This may also happen if the target rejected all inputs we tried so far 00:07:56.280 [2024-07-14 02:55:51.328899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.281 [2024-07-14 02:55:51.328936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.540 NEW_FUNC[1/671]: 0x4a0390 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:56.540 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.540 #5 NEW cov: 11523 ft: 11518 corp: 2/11b lim: 40 exec/s: 0 rss: 66Mb L: 10/10 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:56.540 [2024-07-14 02:55:51.670554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.670592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.540 [2024-07-14 02:55:51.670718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.670734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.540 [2024-07-14 02:55:51.670871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.670887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.540 #6 NEW cov: 11636 ft: 12858 corp: 3/40b lim: 40 exec/s: 0 rss: 66Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:56.540 [2024-07-14 02:55:51.720569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.720599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.540 [2024-07-14 02:55:51.720741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.720760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.540 [2024-07-14 02:55:51.720900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.720916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.540 #7 NEW cov: 11642 ft: 13058 corp: 4/69b lim: 40 exec/s: 0 rss: 66Mb L: 29/29 MS: 1 CrossOver- 00:07:56.540 [2024-07-14 02:55:51.780810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.780839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.540 [2024-07-14 02:55:51.780973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.780989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.540 [2024-07-14 02:55:51.781130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.540 [2024-07-14 02:55:51.781148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.800 #8 NEW cov: 11727 ft: 13318 corp: 5/98b lim: 40 exec/s: 0 rss: 66Mb L: 29/29 MS: 1 ChangeBinInt- 00:07:56.800 [2024-07-14 02:55:51.840399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:51.840427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.800 #9 NEW cov: 11727 ft: 13419 corp: 6/108b lim: 40 exec/s: 0 rss: 66Mb L: 10/29 MS: 1 ChangeBinInt- 00:07:56.800 [2024-07-14 02:55:51.900567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:51.900593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.800 #12 NEW cov: 11727 ft: 13469 corp: 7/120b lim: 40 exec/s: 0 rss: 66Mb L: 12/29 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:56.800 [2024-07-14 02:55:51.951286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:51.951312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.800 [2024-07-14 02:55:51.951480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:51.951497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.800 [2024-07-14 02:55:51.951653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:51.951672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.800 #13 NEW cov: 11727 ft: 13722 corp: 8/149b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CrossOver- 00:07:56.800 [2024-07-14 02:55:52.011215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:52.011245] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.800 [2024-07-14 02:55:52.011380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.800 [2024-07-14 02:55:52.011397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.800 #14 NEW cov: 11727 ft: 13952 corp: 9/168b lim: 40 exec/s: 0 rss: 67Mb L: 19/29 MS: 1 CopyPart- 00:07:57.059 [2024-07-14 02:55:52.061088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.061118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.059 #15 NEW cov: 11727 ft: 14047 corp: 10/177b lim: 40 exec/s: 0 rss: 67Mb L: 9/29 MS: 1 CrossOver- 00:07:57.059 [2024-07-14 02:55:52.111778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.111806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.059 [2024-07-14 02:55:52.111967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.111982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.059 [2024-07-14 02:55:52.112133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f126 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.112150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.059 #16 NEW cov: 11727 ft: 14069 corp: 11/206b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeByte- 00:07:57.059 [2024-07-14 02:55:52.171377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.171405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.059 #17 NEW cov: 11727 ft: 14085 corp: 12/215b lim: 40 exec/s: 0 rss: 67Mb L: 9/29 MS: 1 EraseBytes- 00:07:57.059 [2024-07-14 02:55:52.222363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.222393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.059 [2024-07-14 02:55:52.222542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.222559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.059 [2024-07-14 
02:55:52.222703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.222720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.059 [2024-07-14 02:55:52.222872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.059 [2024-07-14 02:55:52.222889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.059 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.059 #22 NEW cov: 11750 ft: 14444 corp: 13/248b lim: 40 exec/s: 0 rss: 67Mb L: 33/33 MS: 5 CopyPart-ChangeByte-CrossOver-EraseBytes-InsertRepeatedBytes- 00:07:57.060 [2024-07-14 02:55:52.272234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.060 [2024-07-14 02:55:52.272261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.060 [2024-07-14 02:55:52.272397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.060 [2024-07-14 02:55:52.272413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.060 [2024-07-14 02:55:52.272551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.060 [2024-07-14 02:55:52.272569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.060 #23 NEW cov: 11750 ft: 14485 corp: 14/277b lim: 40 exec/s: 0 rss: 67Mb L: 29/33 MS: 1 ShuffleBytes- 00:07:57.318 [2024-07-14 02:55:52.322771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.322799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.322940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.322958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.323100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0af1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.323116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.323242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000af1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
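Each "#NN NEW cov: ..." record above is libFuzzer's standard status line: cov counts covered code edges, ft counts features, corp gives the corpus size in entries/bytes, lim is the current input-length limit, exec/s the execution rate, L: this input's size versus the corpus maximum, and MS: the mutation sequence that produced the input. To pull the coverage curve out of a saved log, a throwaway extraction like this is enough (a hypothetical sketch, not part of the repo):

```bash
#!/usr/bin/env bash
# Hypothetical one-liner: extract "input-number edge-coverage" pairs from the
# NEW status lines of a saved console log, suitable for plotting how coverage
# grows over the course of a run.
log="${1:?usage: $0 <fuzzer-console-log>}"
grep -oE '#[0-9]+ NEW cov: [0-9]+' "$log" | tr -d '#' | awk '{ print $1, $4 }'
```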
00:07:57.318 [2024-07-14 02:55:52.323257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.318 #24 NEW cov: 11750 ft: 14496 corp: 15/313b lim: 40 exec/s: 24 rss: 67Mb L: 36/36 MS: 1 CrossOver- 00:07:57.318 [2024-07-14 02:55:52.382401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.382427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.382570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.382587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.318 #25 NEW cov: 11750 ft: 14535 corp: 16/332b lim: 40 exec/s: 25 rss: 67Mb L: 19/36 MS: 1 ShuffleBytes- 00:07:57.318 [2024-07-14 02:55:52.433102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.433130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.433275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.433294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.433436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f1f1f1f1 cdw11:f1f10000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.433458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.433597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.433614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.318 #26 NEW cov: 11750 ft: 14581 corp: 17/370b lim: 40 exec/s: 26 rss: 67Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:57.318 [2024-07-14 02:55:52.482728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.482755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.482906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.482932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.318 #27 NEW cov: 11750 ft: 14585 corp: 18/390b lim: 40 exec/s: 27 
rss: 67Mb L: 20/38 MS: 1 InsertRepeatedBytes- 00:07:57.318 [2024-07-14 02:55:52.532872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00005500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.532898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.318 [2024-07-14 02:55:52.533029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.318 [2024-07-14 02:55:52.533046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.318 #28 NEW cov: 11750 ft: 14594 corp: 19/410b lim: 40 exec/s: 28 rss: 67Mb L: 20/38 MS: 1 InsertByte- 00:07:57.577 [2024-07-14 02:55:52.583113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.583140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.583271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.583287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.577 #29 NEW cov: 11750 ft: 14610 corp: 20/429b lim: 40 exec/s: 29 rss: 67Mb L: 19/38 MS: 1 ChangeBit- 00:07:57.577 [2024-07-14 02:55:52.633498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.633525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.633664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.633682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.633825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.633844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.577 #30 NEW cov: 11750 ft: 14626 corp: 21/458b lim: 40 exec/s: 30 rss: 67Mb L: 29/38 MS: 1 ShuffleBytes- 00:07:57.577 [2024-07-14 02:55:52.683718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.683746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.683873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:57.577 [2024-07-14 02:55:52.683892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.684024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f1f1f1f1 cdw11:bff1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.684041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.577 #31 NEW cov: 11750 ft: 14644 corp: 22/487b lim: 40 exec/s: 31 rss: 67Mb L: 29/38 MS: 1 ChangeByte- 00:07:57.577 [2024-07-14 02:55:52.734110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.734137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.734269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.734285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.734424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.734445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.734558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f1f10af1 cdw11:f1000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.734577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.577 #32 NEW cov: 11750 ft: 14653 corp: 23/525b lim: 40 exec/s: 32 rss: 67Mb L: 38/38 MS: 1 CrossOver- 00:07:57.577 [2024-07-14 02:55:52.793804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.793831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.577 [2024-07-14 02:55:52.793972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.577 [2024-07-14 02:55:52.793991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.577 #33 NEW cov: 11750 ft: 14663 corp: 24/542b lim: 40 exec/s: 33 rss: 68Mb L: 17/38 MS: 1 EraseBytes- 00:07:57.836 [2024-07-14 02:55:52.854893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.854922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:52.855063] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.855081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:52.855214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.855231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:52.855362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f1ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.855379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:52.855513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:f1f1f10a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.855531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.836 #34 NEW cov: 11750 ft: 14741 corp: 25/582b lim: 40 exec/s: 34 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:57.836 [2024-07-14 02:55:52.913909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.913937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.836 #35 NEW cov: 11750 ft: 14746 corp: 26/595b lim: 40 exec/s: 35 rss: 68Mb L: 13/40 MS: 1 CrossOver- 00:07:57.836 [2024-07-14 02:55:52.964821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:0000f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.964848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:52.964998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f1f1f1f1 cdw11:f1000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.965014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:52.965142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000af1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:52.965160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.836 #36 NEW cov: 11750 ft: 14757 corp: 27/624b lim: 40 exec/s: 36 rss: 68Mb L: 29/40 MS: 1 CrossOver- 00:07:57.836 [2024-07-14 02:55:53.014778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:00300000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:53.014805] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:53.014954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:53.014971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.836 [2024-07-14 02:55:53.015123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f126 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:53.015143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.836 #37 NEW cov: 11750 ft: 14765 corp: 28/653b lim: 40 exec/s: 37 rss: 68Mb L: 29/40 MS: 1 ChangeByte- 00:07:57.836 [2024-07-14 02:55:53.074423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00000800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.836 [2024-07-14 02:55:53.074453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.097 #38 NEW cov: 11750 ft: 14782 corp: 29/666b lim: 40 exec/s: 38 rss: 68Mb L: 13/40 MS: 1 ChangeBit- 00:07:58.097 [2024-07-14 02:55:53.125728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.125754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.125888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.125906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.125991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.126007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.126142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f1ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.126158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.126297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:f1f1f10a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.126315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.097 #39 NEW cov: 11750 ft: 14916 corp: 30/706b lim: 40 exec/s: 39 rss: 68Mb L: 40/40 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:58.097 [2024-07-14 02:55:53.185015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.185041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.185175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.185191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.097 #40 NEW cov: 11750 ft: 14921 corp: 31/725b lim: 40 exec/s: 40 rss: 68Mb L: 19/40 MS: 1 ShuffleBytes- 00:07:58.097 [2024-07-14 02:55:53.236096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.236121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.236260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.236277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.236406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0af1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.236423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.236558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f1ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.236574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.236707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffff8e cdw11:f1f1f10a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.236722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.097 #41 NEW cov: 11750 ft: 14988 corp: 32/765b lim: 40 exec/s: 41 rss: 68Mb L: 40/40 MS: 1 ChangeByte- 00:07:58.097 [2024-07-14 02:55:53.286199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.286226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.286373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.286391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.286541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:0af1b1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.286559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.286697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f1ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.286713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.097 [2024-07-14 02:55:53.286849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:f1f1f10a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.097 [2024-07-14 02:55:53.286865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.097 #42 NEW cov: 11750 ft: 15001 corp: 33/805b lim: 40 exec/s: 21 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:07:58.097 #42 DONE cov: 11750 ft: 15001 corp: 33/805b lim: 40 exec/s: 21 rss: 68Mb 00:07:58.097 ###### Recommended dictionary. ###### 00:07:58.097 "\001\000\000\000" # Uses: 0 00:07:58.097 ###### End of recommended dictionary. ###### 00:07:58.097 Done 42 runs in 2 second(s) 00:07:58.356 02:55:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:58.356 02:55:53 -- ../common.sh@72 -- # (( i++ )) 00:07:58.356 02:55:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.356 02:55:53 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:58.356 02:55:53 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:58.356 02:55:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.356 02:55:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.356 02:55:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:58.356 02:55:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:58.356 02:55:53 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:58.356 02:55:53 -- nvmf/run.sh@29 -- # port=4412 00:07:58.356 02:55:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:58.356 02:55:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:58.356 02:55:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.356 02:55:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:58.356 [2024-07-14 02:55:53.468150] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
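The nvmf/run.sh xtrace lines just above show how the harness provisions each fuzzer instance: run NN is zero-padded with printf %02d, listens on TCP port 44NN (here 4412), and gets a private copy of fuzz_json.conf with the stock trsvcid 4420 rewritten. A minimal bash sketch of that derivation, assuming only what the trace itself shows (the redirect into /tmp/fuzz_json_NN.conf is inferred from the nvmf_cfg assignment):

```bash
#!/usr/bin/env bash
# Sketch of the per-instance setup visible in the nvmf/run.sh trace above.
fuzzer_type=12                          # run number passed to start_llvm_fuzz
printf -v nn '%02d' "$fuzzer_type"      # 12 -> "12" (zero-padded)
port="44${nn}"                          # -> 4412, matching "port=4412"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

# Rewrite the stock config (which targets 4420) for this instance; the output
# path matches the nvmf_cfg=/tmp/fuzz_json_12.conf assignment in the trace.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    test/fuzz/llvm/nvmf/fuzz_json.conf > "/tmp/fuzz_json_${nn}.conf"
```

The llvm_nvme_fuzz invocation for run 12 then takes this conf via -c and the trid via -F, exactly as the trace shows.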
00:07:58.356 [2024-07-14 02:55:53.468219] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid671948 ] 00:07:58.356 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.615 [2024-07-14 02:55:53.640251] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.615 [2024-07-14 02:55:53.660035] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.615 [2024-07-14 02:55:53.660153] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.615 [2024-07-14 02:55:53.711565] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.615 [2024-07-14 02:55:53.727844] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:58.615 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.615 INFO: Seed: 2336245199 00:07:58.615 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:58.615 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:58.615 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:58.615 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.616 #2 INITED exec/s: 0 rss: 59Mb 00:07:58.616 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:58.616 This may also happen if the target rejected all inputs we tried so far 00:07:58.616 [2024-07-14 02:55:53.773450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.616 [2024-07-14 02:55:53.773479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.616 [2024-07-14 02:55:53.773537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.616 [2024-07-14 02:55:53.773552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.616 [2024-07-14 02:55:53.773607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.616 [2024-07-14 02:55:53.773621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.874 NEW_FUNC[1/671]: 0x4a2100 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:58.874 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.874 #6 NEW cov: 11520 ft: 11522 corp: 2/26b lim: 40 exec/s: 0 rss: 66Mb L: 25/25 MS: 4 ChangeBinInt-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:58.874 [2024-07-14 02:55:54.093791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.874 [2024-07-14 02:55:54.093824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.874 #10 NEW cov: 11634 ft: 12737 corp: 3/39b lim: 40 exec/s: 0 rss: 67Mb L: 13/25 MS: 4 ChangeBit-ChangeByte-ChangeBit-CrossOver- 00:07:59.133 [2024-07-14 02:55:54.133886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.133917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 #11 NEW cov: 11640 ft: 12960 corp: 4/52b lim: 40 exec/s: 0 rss: 67Mb L: 13/25 MS: 1 ShuffleBytes- 00:07:59.133 [2024-07-14 02:55:54.173931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.173958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 #12 NEW cov: 11725 ft: 13257 corp: 5/65b lim: 40 exec/s: 0 rss: 67Mb L: 13/25 MS: 1 ShuffleBytes- 00:07:59.133 [2024-07-14 02:55:54.214467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.214493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.214546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.214559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.214613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.214627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.214679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.214692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.133 #13 NEW cov: 11725 ft: 13669 corp: 6/104b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:59.133 [2024-07-14 02:55:54.254200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.254225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 #19 NEW cov: 11725 ft: 13784 corp: 7/117b lim: 40 exec/s: 0 rss: 67Mb L: 13/39 MS: 1 ChangeBit- 00:07:59.133 [2024-07-14 02:55:54.294727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.294753] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.294806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.294820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.294874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2ccb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.294888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.294939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d3d3d32c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.294953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.133 #20 NEW cov: 11725 ft: 13855 corp: 8/156b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:59.133 [2024-07-14 02:55:54.334426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b0d cdw11:0000001b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.334458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 #21 NEW cov: 11725 ft: 13934 corp: 9/169b lim: 40 exec/s: 0 rss: 67Mb L: 13/39 MS: 1 ChangeBinInt- 00:07:59.133 [2024-07-14 02:55:54.375115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.375140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.375195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.375209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.375260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2ccb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.375274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.375323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d326d3d3 cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.375337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.133 [2024-07-14 02:55:54.375387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.133 [2024-07-14 02:55:54.375401] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.392 #22 NEW cov: 11725 ft: 14037 corp: 10/209b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertByte- 00:07:59.392 [2024-07-14 02:55:54.414625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1bea cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.414651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.392 #23 NEW cov: 11725 ft: 14177 corp: 11/222b lim: 40 exec/s: 0 rss: 67Mb L: 13/40 MS: 1 ChangeBinInt- 00:07:59.392 [2024-07-14 02:55:54.454741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e401b1b cdw11:ea1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.454766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.392 #24 NEW cov: 11725 ft: 14255 corp: 12/236b lim: 40 exec/s: 0 rss: 68Mb L: 14/40 MS: 1 InsertByte- 00:07:59.392 [2024-07-14 02:55:54.495295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.495320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.392 [2024-07-14 02:55:54.495375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.495390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.392 [2024-07-14 02:55:54.495452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c3c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.495469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.392 [2024-07-14 02:55:54.495522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.495535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.392 #25 NEW cov: 11725 ft: 14261 corp: 13/275b lim: 40 exec/s: 0 rss: 68Mb L: 39/40 MS: 1 ChangeBit- 00:07:59.392 [2024-07-14 02:55:54.534964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.534990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.392 #26 NEW cov: 11725 ft: 14302 corp: 14/288b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 ShuffleBytes- 00:07:59.392 [2024-07-14 02:55:54.575104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.575129] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.392 #27 NEW cov: 11725 ft: 14332 corp: 15/301b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 ChangeBinInt- 00:07:59.392 [2024-07-14 02:55:54.615201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.392 [2024-07-14 02:55:54.615227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.392 #28 NEW cov: 11725 ft: 14403 corp: 16/314b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 ShuffleBytes- 00:07:59.651 [2024-07-14 02:55:54.655867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.655894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.655947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.655961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.656012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2ccb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.656026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.656077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d3d3d32c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.656090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.656142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.656157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.651 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.651 #29 NEW cov: 11748 ft: 14436 corp: 17/354b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:07:59.651 [2024-07-14 02:55:54.695546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.695574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.695628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.695642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
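The "#N NEW cov: ..." records interleaved with the qpair notices above are libFuzzer status lines: cov is the number of coverage points hit so far, ft the number of features (finer-grained coverage signals), corp the corpus as units/total bytes, lim the current input-length cap, exec/s and rss throughput and memory, L roughly the new unit's length versus the largest unit seen so far, and MS the mutation sequence (ShuffleBytes, ChangeBit, InsertRepeatedBytes, and so on) that produced the new input. A minimal Python 3 sketch for pulling the coverage trajectory out of a saved copy of this console output; the script and its names are illustrative and not part of the test tree:

import re
import sys

# Matches libFuzzer status lines such as:
#   #41 NEW cov: 11750 ft: 14988 corp: 32/765b lim: 40 exec/s: 41 rss: 68Mb ...
STATUS = re.compile(
    r"#(?P<exec_no>\d+)\s+(?:NEW|REDUCE|DONE)\s+"
    r"cov:\s*(?P<cov>\d+)\s+ft:\s*(?P<ft>\d+)\s+"
    r"corp:\s*(?P<units>\d+)/(?P<size>\d+)b"
)

def coverage_trace(lines):
    """Yield (exec_no, coverage_points, features, corpus_units) per status line."""
    for line in lines:
        m = STATUS.search(line)  # search, since records are fused with timestamps here
        if m:
            yield tuple(int(m.group(g)) for g in ("exec_no", "cov", "ft", "units"))

if __name__ == "__main__":
    for exec_no, cov, ft, units in coverage_trace(sys.stdin):
        print(f"#{exec_no}: cov={cov} ft={ft} corpus={units}")

Piping one run's portion of the log through it shows cov and ft climbing from the first NEW record to the values echoed on the closing DONE line.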
00:07:59.651 #30 NEW cov: 11748 ft: 14644 corp: 18/375b lim: 40 exec/s: 0 rss: 68Mb L: 21/40 MS: 1 CopyPart- 00:07:59.651 [2024-07-14 02:55:54.735709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:4a4a4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.735734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.735788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:4a4a4a4a cdw11:4a4a1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.735802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 #31 NEW cov: 11748 ft: 14666 corp: 19/398b lim: 40 exec/s: 31 rss: 68Mb L: 23/40 MS: 1 InsertRepeatedBytes- 00:07:59.651 [2024-07-14 02:55:54.775955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.775980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.776034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.776049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.776100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:1bf91b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.776114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 #32 NEW cov: 11748 ft: 14680 corp: 20/423b lim: 40 exec/s: 32 rss: 68Mb L: 25/40 MS: 1 ChangeByte- 00:07:59.651 [2024-07-14 02:55:54.816367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.816391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.816446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.816460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.816512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2ccb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.816526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.816579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d3d3d32c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 
02:55:54.816592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.816643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.816659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.651 #33 NEW cov: 11748 ft: 14731 corp: 21/463b lim: 40 exec/s: 33 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:07:59.651 [2024-07-14 02:55:54.856476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.856501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.856554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.856568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-07-14 02:55:54.856622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.651 [2024-07-14 02:55:54.856635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.652 [2024-07-14 02:55:54.856689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2ccbd3d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.652 [2024-07-14 02:55:54.856703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.652 [2024-07-14 02:55:54.856754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:d32c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.652 [2024-07-14 02:55:54.856767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.652 #34 NEW cov: 11748 ft: 14741 corp: 22/503b lim: 40 exec/s: 34 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:07:59.652 [2024-07-14 02:55:54.896315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.652 [2024-07-14 02:55:54.896340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.652 [2024-07-14 02:55:54.896395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.652 [2024-07-14 02:55:54.896409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.652 [2024-07-14 02:55:54.896463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:1bf91be3 cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:59.652 [2024-07-14 02:55:54.896477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.911 #35 NEW cov: 11748 ft: 14747 corp: 23/528b lim: 40 exec/s: 35 rss: 69Mb L: 25/40 MS: 1 ChangeByte- 00:07:59.911 [2024-07-14 02:55:54.936134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b6e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:54.936160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 #36 NEW cov: 11748 ft: 14769 corp: 24/543b lim: 40 exec/s: 36 rss: 69Mb L: 15/40 MS: 1 CrossOver- 00:07:59.911 [2024-07-14 02:55:54.976240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:54.976265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 #37 NEW cov: 11748 ft: 14776 corp: 25/557b lim: 40 exec/s: 37 rss: 69Mb L: 14/40 MS: 1 InsertByte- 00:07:59.911 [2024-07-14 02:55:55.016345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:55.016369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 #38 NEW cov: 11748 ft: 14788 corp: 26/565b lim: 40 exec/s: 38 rss: 69Mb L: 8/40 MS: 1 EraseBytes- 00:07:59.911 [2024-07-14 02:55:55.056473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b031b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:55.056497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 #39 NEW cov: 11748 ft: 14817 corp: 27/576b lim: 40 exec/s: 39 rss: 69Mb L: 11/40 MS: 1 EraseBytes- 00:07:59.911 [2024-07-14 02:55:55.097019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:55.097044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 [2024-07-14 02:55:55.097097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:55.097111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.911 [2024-07-14 02:55:55.097165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:55.097179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.911 [2024-07-14 02:55:55.097231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 
[2024-07-14 02:55:55.097245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.911 #40 NEW cov: 11748 ft: 14826 corp: 28/610b lim: 40 exec/s: 40 rss: 69Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:07:59.911 [2024-07-14 02:55:55.136715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e6e1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.911 [2024-07-14 02:55:55.136741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 #41 NEW cov: 11748 ft: 14834 corp: 29/625b lim: 40 exec/s: 41 rss: 69Mb L: 15/40 MS: 1 ShuffleBytes- 00:08:00.170 [2024-07-14 02:55:55.176815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:0b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.176841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 #42 NEW cov: 11748 ft: 14845 corp: 30/638b lim: 40 exec/s: 42 rss: 69Mb L: 13/40 MS: 1 ChangeBit- 00:08:00.170 [2024-07-14 02:55:55.217364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.217390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.217448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:3c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.217462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.217519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.217533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.217586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.217600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.170 #43 NEW cov: 11748 ft: 14860 corp: 31/671b lim: 40 exec/s: 43 rss: 69Mb L: 33/40 MS: 1 EraseBytes- 00:08:00.170 [2024-07-14 02:55:55.257499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.257524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.257580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.257594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.257645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.257659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.257713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:22ffffff cdw11:ffffff1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.257726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.170 #44 NEW cov: 11748 ft: 14874 corp: 32/705b lim: 40 exec/s: 44 rss: 69Mb L: 34/40 MS: 1 ChangeBinInt- 00:08:00.170 [2024-07-14 02:55:55.297618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.297643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.297698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.297712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.297763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.297778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.170 [2024-07-14 02:55:55.297828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:22ffffff cdw11:ff74ff1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.297841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.170 #45 NEW cov: 11748 ft: 14878 corp: 33/739b lim: 40 exec/s: 45 rss: 70Mb L: 34/40 MS: 1 ChangeByte- 00:08:00.170 [2024-07-14 02:55:55.337286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1bea cdw11:5b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.337311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 #46 NEW cov: 11748 ft: 14887 corp: 34/752b lim: 40 exec/s: 46 rss: 70Mb L: 13/40 MS: 1 ChangeBit- 00:08:00.170 [2024-07-14 02:55:55.377409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1bea cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.377434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.170 #47 NEW cov: 11748 ft: 14924 corp: 35/763b lim: 40 exec/s: 47 rss: 70Mb L: 11/40 MS: 1 EraseBytes- 00:08:00.170 [2024-07-14 02:55:55.418012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c6c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.170 [2024-07-14 02:55:55.418038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.171 [2024-07-14 02:55:55.418094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:3c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.171 [2024-07-14 02:55:55.418109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.171 [2024-07-14 02:55:55.418162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.171 [2024-07-14 02:55:55.418175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.171 [2024-07-14 02:55:55.418229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.171 [2024-07-14 02:55:55.418242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.429 #48 NEW cov: 11748 ft: 14932 corp: 36/796b lim: 40 exec/s: 48 rss: 70Mb L: 33/40 MS: 1 ChangeBit- 00:08:00.429 [2024-07-14 02:55:55.457677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.457702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.429 #49 NEW cov: 11748 ft: 14945 corp: 37/811b lim: 40 exec/s: 49 rss: 70Mb L: 15/40 MS: 1 CrossOver- 00:08:00.429 [2024-07-14 02:55:55.497942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b4d1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.497968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.429 [2024-07-14 02:55:55.498024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.498038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.429 #50 NEW cov: 11748 ft: 14959 corp: 38/827b lim: 40 exec/s: 50 rss: 70Mb L: 16/40 MS: 1 InsertByte- 00:08:00.429 [2024-07-14 02:55:55.538067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.538092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.429 [2024-07-14 02:55:55.538145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.538159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.429 #51 NEW cov: 11748 ft: 14966 corp: 39/846b lim: 40 exec/s: 51 rss: 70Mb L: 19/40 MS: 1 CrossOver- 00:08:00.429 [2024-07-14 02:55:55.578484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.578509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.429 [2024-07-14 02:55:55.578563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.578576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.429 [2024-07-14 02:55:55.578630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c3c2ccb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.578644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.429 [2024-07-14 02:55:55.578695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d3d3d32c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.578708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.429 #57 NEW cov: 11748 ft: 14996 corp: 40/885b lim: 40 exec/s: 57 rss: 70Mb L: 39/40 MS: 1 ChangeBit- 00:08:00.429 [2024-07-14 02:55:55.618438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.618466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.429 [2024-07-14 02:55:55.618519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:cbd326d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.429 [2024-07-14 02:55:55.618533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.430 [2024-07-14 02:55:55.618587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d32c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.430 [2024-07-14 02:55:55.618601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.430 #58 NEW cov: 11748 ft: 15033 corp: 41/914b lim: 40 exec/s: 58 rss: 70Mb L: 29/40 MS: 1 EraseBytes- 00:08:00.430 [2024-07-14 02:55:55.658434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.430 [2024-07-14 02:55:55.658463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.430 [2024-07-14 02:55:55.658518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1b1b1b1b cdw11:1b1b1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:00.430 [2024-07-14 02:55:55.658532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.689 #59 NEW cov: 11748 ft: 15052 corp: 42/936b lim: 40 exec/s: 59 rss: 70Mb L: 22/40 MS: 1 InsertByte- 00:08:00.689 [2024-07-14 02:55:55.698578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:6e1b1b1b cdw11:4aff4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.689 [2024-07-14 02:55:55.698604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.689 [2024-07-14 02:55:55.698660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:4a4a4a4a cdw11:4a4a1b1b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.689 [2024-07-14 02:55:55.698675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.689 #60 NEW cov: 11748 ft: 15064 corp: 43/959b lim: 40 exec/s: 60 rss: 70Mb L: 23/40 MS: 1 ChangeByte- 00:08:00.689 [2024-07-14 02:55:55.738815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.689 [2024-07-14 02:55:55.738841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.689 [2024-07-14 02:55:55.738897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:cbd326d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.689 [2024-07-14 02:55:55.738911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.689 [2024-07-14 02:55:55.738963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d32c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.689 [2024-07-14 02:55:55.738976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.689 #61 NEW cov: 11748 ft: 15067 corp: 44/988b lim: 40 exec/s: 30 rss: 70Mb L: 29/40 MS: 1 ChangeByte- 00:08:00.689 #61 DONE cov: 11748 ft: 15067 corp: 44/988b lim: 40 exec/s: 30 rss: 70Mb 00:08:00.689 Done 61 runs in 2 second(s) 00:08:00.689 02:55:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:00.689 02:55:55 -- ../common.sh@72 -- # (( i++ )) 00:08:00.689 02:55:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.689 02:55:55 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:00.689 02:55:55 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:00.689 02:55:55 -- nvmf/run.sh@24 -- # local timen=1 00:08:00.689 02:55:55 -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.689 02:55:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:00.689 02:55:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:00.689 02:55:55 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:00.689 02:55:55 -- nvmf/run.sh@29 -- # port=4413 00:08:00.689 02:55:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:00.689 02:55:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:00.689 
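The run.sh trace above shows how each fuzzer instance is wired to a private NVMe/TCP listener: printf %02d zero-pads the fuzzer id, the port becomes "44" followed by that suffix (4412 for fuzzer 12, 4413 for fuzzer 13), and the sed on the trace line that follows rewrites the default trsvcid 4420 in fuzz_json.conf to match before llvm_nvme_fuzz is launched against the per-fuzzer corpus directory. A hypothetical Python rendering of that derivation; the function and parameter names are illustrative and not part of run.sh:

def fuzzer_trid(fuzzer_type: int) -> str:
    """Port is '44' plus the zero-padded fuzzer id (printf %02d), so
    fuzzer 12 gets 4412, fuzzer 13 gets 4413, and so on."""
    port = int(f"44{fuzzer_type:02d}")
    return ("trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 "
            f"traddr:127.0.0.1 trsvcid:{port}")

assert fuzzer_trid(12).endswith("trsvcid:4412")
assert fuzzer_trid(13).endswith("trsvcid:4413")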
02:55:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.689 02:55:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:00.689 [2024-07-14 02:55:55.916582] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:00.689 [2024-07-14 02:55:55.916660] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid672341 ] 00:08:00.947 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.947 [2024-07-14 02:55:56.091138] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.947 [2024-07-14 02:55:56.110925] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.947 [2024-07-14 02:55:56.111042] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.948 [2024-07-14 02:55:56.162373] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.948 [2024-07-14 02:55:56.178667] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:00.948 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.948 INFO: Seed: 491270440 00:08:01.206 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:01.206 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:01.206 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:01.206 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.206 #2 INITED exec/s: 0 rss: 59Mb 00:08:01.206 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:01.206 This may also happen if the target rejected all inputs we tried so far 00:08:01.206 [2024-07-14 02:55:56.248760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.206 [2024-07-14 02:55:56.248799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.206 [2024-07-14 02:55:56.248939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.206 [2024-07-14 02:55:56.248962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.466 NEW_FUNC[1/670]: 0x4a3cc0 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:01.466 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.466 #6 NEW cov: 11509 ft: 11510 corp: 2/23b lim: 40 exec/s: 0 rss: 66Mb L: 22/22 MS: 4 ChangeByte-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:01.466 [2024-07-14 02:55:56.569636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:90808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.569678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.466 [2024-07-14 02:55:56.569825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.569848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.466 #7 NEW cov: 11622 ft: 12010 corp: 3/45b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 ChangeBit- 00:08:01.466 [2024-07-14 02:55:56.619668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.619700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.466 [2024-07-14 02:55:56.619835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.619857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.466 #8 NEW cov: 11628 ft: 12156 corp: 4/67b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 ShuffleBytes- 00:08:01.466 [2024-07-14 02:55:56.660001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.660030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.466 [2024-07-14 02:55:56.660167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) 
qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.660188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.466 [2024-07-14 02:55:56.660328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.660346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.466 #9 NEW cov: 11713 ft: 12568 corp: 5/97b lim: 40 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 CMP- DE: "\001)\355\235\232)\333\252"- 00:08:01.466 [2024-07-14 02:55:56.700070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.700099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.466 [2024-07-14 02:55:56.700231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.700251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.466 [2024-07-14 02:55:56.700392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.466 [2024-07-14 02:55:56.700412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.725 #10 NEW cov: 11713 ft: 12749 corp: 6/127b lim: 40 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 ChangeBit- 00:08:01.725 [2024-07-14 02:55:56.740480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.740508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.740643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.740663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.740799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.740819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.740965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.740984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.725 #11 NEW cov: 11713 ft: 13328 
corp: 7/166b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:01.725 [2024-07-14 02:55:56.780550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.780577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.780710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.780731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.780870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.780889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.781012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.781037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.725 #12 NEW cov: 11713 ft: 13464 corp: 8/205b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CopyPart- 00:08:01.725 [2024-07-14 02:55:56.820246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.820274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.820411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.820431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.725 #13 NEW cov: 11713 ft: 13490 corp: 9/227b lim: 40 exec/s: 0 rss: 68Mb L: 22/39 MS: 1 ShuffleBytes- 00:08:01.725 [2024-07-14 02:55:56.860404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.860431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.860572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:90808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.860592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.725 #14 NEW cov: 11713 ft: 13532 corp: 10/249b lim: 40 exec/s: 0 rss: 68Mb L: 22/39 MS: 1 CrossOver- 00:08:01.725 [2024-07-14 02:55:56.900750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 
cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.900778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.900912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.900934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.901080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.901101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.725 #15 NEW cov: 11713 ft: 13556 corp: 11/279b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 CrossOver- 00:08:01.725 [2024-07-14 02:55:56.940656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.940684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.725 [2024-07-14 02:55:56.940829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.725 [2024-07-14 02:55:56.940854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.725 #16 NEW cov: 11713 ft: 13658 corp: 12/301b lim: 40 exec/s: 0 rss: 68Mb L: 22/39 MS: 1 ShuffleBytes- 00:08:01.985 [2024-07-14 02:55:56.980813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808059 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:56.980842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:56.980979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:56.981001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.985 #17 NEW cov: 11713 ft: 13723 corp: 13/324b lim: 40 exec/s: 0 rss: 68Mb L: 23/39 MS: 1 InsertByte- 00:08:01.985 [2024-07-14 02:55:57.020912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808059 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.020939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.021081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.021101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.985 #18 NEW cov: 11713 ft: 13764 corp: 14/347b lim: 40 exec/s: 0 rss: 68Mb L: 23/39 MS: 1 ChangeBit- 00:08:01.985 [2024-07-14 02:55:57.061270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.061298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.061432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.061457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.061594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.061612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.985 #19 NEW cov: 11713 ft: 13783 corp: 15/377b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 PersAutoDict- DE: "\001)\355\235\232)\333\252"- 00:08:01.985 [2024-07-14 02:55:57.101367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.101393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.101529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.101551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.101698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.101718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.985 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.985 #20 NEW cov: 11736 ft: 13824 corp: 16/407b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 ChangeBit- 00:08:01.985 [2024-07-14 02:55:57.151306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.151333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.151477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:90808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.151498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
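A minimal sketch, assuming SPDK's public raw-admin API, of how an admin command shaped like the DIRECTIVE RECEIVE (opc 0x1a) entries printed above could be constructed; the controller handle and callback are illustrative assumptions, the cdw10/cdw11 values are copied from one printed command, and this is not the fuzzer's own submission path.

/* Build and submit a raw admin command matching one of the log entries.
 * spdk_nvme_ctrlr_cmd_admin_raw() is SPDK's public API for arbitrary
 * admin commands; ctrlr is assumed to be an already-attached controller. */
#include <string.h>
#include "spdk/nvme.h"

static void
admin_done(void *cb_arg, const struct spdk_nvme_cpl *cpl)
{
    /* The target rejects these mutated commands, which is what the
     * INVALID OPCODE (00/01) completions above record. */
    (void)cb_arg;
    (void)cpl;
}

static int
send_directive_receive(struct spdk_nvme_ctrlr *ctrlr)
{
    struct spdk_nvme_cmd cmd;

    memset(&cmd, 0, sizeof(cmd));
    cmd.opc   = 0x1a;        /* DIRECTIVE RECEIVE */
    cmd.nsid  = 0;           /* nsid:0, as printed */
    cmd.cdw10 = 0x80808080;  /* mutated dwords straight from the corpus */
    cmd.cdw11 = 0x80808080;

    return spdk_nvme_ctrlr_cmd_admin_raw(ctrlr, &cmd, NULL, 0,
                                         admin_done, NULL);
}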
00:08:01.985 #21 NEW cov: 11736 ft: 13858 corp: 17/429b lim: 40 exec/s: 0 rss: 68Mb L: 22/39 MS: 1 ChangeBinInt- 00:08:01.985 [2024-07-14 02:55:57.191610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808880 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.191638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.191778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.191798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.191937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.191957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.985 #22 NEW cov: 11736 ft: 13935 corp: 18/459b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 ChangeBit- 00:08:01.985 [2024-07-14 02:55:57.231503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808059 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.231532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.985 [2024-07-14 02:55:57.231673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:19808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.985 [2024-07-14 02:55:57.231693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.244 #23 NEW cov: 11736 ft: 13956 corp: 19/482b lim: 40 exec/s: 23 rss: 68Mb L: 23/39 MS: 1 ChangeByte- 00:08:02.244 [2024-07-14 02:55:57.272307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.244 [2024-07-14 02:55:57.272336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.244 [2024-07-14 02:55:57.272475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.244 [2024-07-14 02:55:57.272497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.244 [2024-07-14 02:55:57.272634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.272655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.245 [2024-07-14 02:55:57.272791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff0f SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.272812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.245 [2024-07-14 02:55:57.272950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:80808080 cdw11:80805b93 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.272971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.245 #24 NEW cov: 11736 ft: 14037 corp: 20/522b lim: 40 exec/s: 24 rss: 69Mb L: 40/40 MS: 1 InsertByte- 00:08:02.245 [2024-07-14 02:55:57.341849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.341886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.245 [2024-07-14 02:55:57.342047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.342074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.245 #25 NEW cov: 11736 ft: 14153 corp: 21/544b lim: 40 exec/s: 25 rss: 69Mb L: 22/40 MS: 1 ChangeBinInt- 00:08:02.245 [2024-07-14 02:55:57.412276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.412311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.245 [2024-07-14 02:55:57.412464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.412486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.245 [2024-07-14 02:55:57.412635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29dbaa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.412659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.245 #26 NEW cov: 11736 ft: 14251 corp: 22/574b lim: 40 exec/s: 26 rss: 69Mb L: 30/40 MS: 1 ChangeBinInt- 00:08:02.245 [2024-07-14 02:55:57.482804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.482866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.245 [2024-07-14 02:55:57.483024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.483059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.245 
[2024-07-14 02:55:57.483227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:8080aa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.245 [2024-07-14 02:55:57.483260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.504 #27 NEW cov: 11736 ft: 14327 corp: 23/604b lim: 40 exec/s: 27 rss: 69Mb L: 30/40 MS: 1 CrossOver- 00:08:02.504 [2024-07-14 02:55:57.532617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.532646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.532771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.532788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.532910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed29db cdw11:aa808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.532930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.504 #28 NEW cov: 11736 ft: 14338 corp: 24/632b lim: 40 exec/s: 28 rss: 69Mb L: 28/40 MS: 1 EraseBytes- 00:08:02.504 [2024-07-14 02:55:57.572817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.572844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.572974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80800000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.572993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.573127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.573145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.573275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.573293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.504 #29 NEW cov: 11736 ft: 14355 corp: 25/671b lim: 40 exec/s: 29 rss: 69Mb L: 39/40 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:02.504 [2024-07-14 02:55:57.612702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808059 cdw11:80808080 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.612729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.612869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.612885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.504 #30 NEW cov: 11736 ft: 14363 corp: 26/694b lim: 40 exec/s: 30 rss: 69Mb L: 23/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:02.504 [2024-07-14 02:55:57.652698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:90808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.652726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.652866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80608080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.652882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.504 #31 NEW cov: 11736 ft: 14370 corp: 27/717b lim: 40 exec/s: 31 rss: 69Mb L: 23/40 MS: 1 InsertByte- 00:08:02.504 [2024-07-14 02:55:57.692851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.692878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.504 [2024-07-14 02:55:57.693026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.504 [2024-07-14 02:55:57.693046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.504 #32 NEW cov: 11736 ft: 14375 corp: 28/739b lim: 40 exec/s: 32 rss: 69Mb L: 22/40 MS: 1 ShuffleBytes- 00:08:02.504 [2024-07-14 02:55:57.732982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.505 [2024-07-14 02:55:57.733008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.505 [2024-07-14 02:55:57.733149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80802780 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.505 [2024-07-14 02:55:57.733166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.505 #33 NEW cov: 11736 ft: 14379 corp: 29/762b lim: 40 exec/s: 33 rss: 69Mb L: 23/40 MS: 1 InsertByte- 00:08:02.764 [2024-07-14 02:55:57.773098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:02.764 [2024-07-14 02:55:57.773125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.773274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.773290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.764 #34 NEW cov: 11736 ft: 14387 corp: 30/783b lim: 40 exec/s: 34 rss: 69Mb L: 21/40 MS: 1 EraseBytes- 00:08:02.764 [2024-07-14 02:55:57.813013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.813040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.764 #35 NEW cov: 11736 ft: 14693 corp: 31/794b lim: 40 exec/s: 35 rss: 69Mb L: 11/40 MS: 1 EraseBytes- 00:08:02.764 [2024-07-14 02:55:57.853844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.853869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.854004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.854020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.854152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8080e8e8 cdw11:e8e8e8e8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.854169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.854311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:e8e8e8e8 cdw11:e8e8e880 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.854327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.764 #36 NEW cov: 11736 ft: 14756 corp: 32/829b lim: 40 exec/s: 36 rss: 69Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:08:02.764 [2024-07-14 02:55:57.894000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.894026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.894156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:5b808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.894172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 
02:55:57.894269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8080e8e8 cdw11:e8e8e8e8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.894286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.894414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:e8e8e8e8 cdw11:e8e8e880 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.894429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.764 #37 NEW cov: 11736 ft: 14773 corp: 33/864b lim: 40 exec/s: 37 rss: 69Mb L: 35/40 MS: 1 ChangeByte- 00:08:02.764 [2024-07-14 02:55:57.934016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.934041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.934178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8080ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.764 [2024-07-14 02:55:57.934194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.764 [2024-07-14 02:55:57.934323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:57.934340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.765 [2024-07-14 02:55:57.934417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffff80 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:57.934432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.765 #38 NEW cov: 11739 ft: 14884 corp: 34/899b lim: 40 exec/s: 38 rss: 69Mb L: 35/40 MS: 1 EraseBytes- 00:08:02.765 [2024-07-14 02:55:57.974038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:57.974065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.765 [2024-07-14 02:55:57.974222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:57.974239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.765 [2024-07-14 02:55:57.974383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0129ed29 cdw11:dbaa8080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:57.974399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.765 #39 NEW cov: 11739 ft: 14927 corp: 35/927b lim: 40 exec/s: 39 rss: 69Mb L: 28/40 MS: 1 CopyPart- 00:08:02.765 [2024-07-14 02:55:58.013938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:58.013970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.765 [2024-07-14 02:55:58.014109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.765 [2024-07-14 02:55:58.014129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.025 #40 NEW cov: 11739 ft: 14937 corp: 36/949b lim: 40 exec/s: 40 rss: 69Mb L: 22/40 MS: 1 CrossOver- 00:08:03.025 [2024-07-14 02:55:58.054225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:82808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.054251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.054372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.054388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.054532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:8080aa80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.054549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.025 #41 NEW cov: 11739 ft: 14946 corp: 37/979b lim: 40 exec/s: 41 rss: 69Mb L: 30/40 MS: 1 ChangeBit- 00:08:03.025 [2024-07-14 02:55:58.094779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.094806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.094958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.094974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.095105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff93ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.095122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.095249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff 
cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.095264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.095359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:80808080 cdw11:80805b93 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.095376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.025 #42 NEW cov: 11739 ft: 14950 corp: 38/1019b lim: 40 exec/s: 42 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:08:03.025 [2024-07-14 02:55:58.134277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.134305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.134448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.134469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.025 #43 NEW cov: 11739 ft: 14960 corp: 39/1041b lim: 40 exec/s: 43 rss: 69Mb L: 22/40 MS: 1 CopyPart- 00:08:03.025 [2024-07-14 02:55:58.174744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.174769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.174907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.174923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.175051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29ed9d9a cdw11:29ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.175068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.175196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:dbaa8080 cdw11:80808253 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.175213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.025 #44 NEW cov: 11739 ft: 14969 corp: 40/1074b lim: 40 exec/s: 44 rss: 70Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:08:03.025 [2024-07-14 02:55:58.215100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.215125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.215264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.215280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.215409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.215424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.215562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff0f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.215580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.025 [2024-07-14 02:55:58.215706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:80808080 cdw11:80808093 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.025 [2024-07-14 02:55:58.215722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.025 #45 NEW cov: 11739 ft: 14997 corp: 41/1114b lim: 40 exec/s: 22 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:08:03.025 #45 DONE cov: 11739 ft: 14997 corp: 41/1114b lim: 40 exec/s: 22 rss: 70Mb 00:08:03.025 ###### Recommended dictionary. ###### 00:08:03.025 "\001)\355\235\232)\333\252" # Uses: 1 00:08:03.025 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:03.025 ###### End of recommended dictionary. 
###### 00:08:03.025 Done 45 runs in 2 second(s) 00:08:03.285 02:55:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:03.285 02:55:58 -- ../common.sh@72 -- # (( i++ )) 00:08:03.285 02:55:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.285 02:55:58 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:03.285 02:55:58 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:03.285 02:55:58 -- nvmf/run.sh@24 -- # local timen=1 00:08:03.285 02:55:58 -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.285 02:55:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:03.285 02:55:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:03.285 02:55:58 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:03.285 02:55:58 -- nvmf/run.sh@29 -- # port=4414 00:08:03.285 02:55:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:03.285 02:55:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:03.285 02:55:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.285 02:55:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:03.285 [2024-07-14 02:55:58.398311] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:03.285 [2024-07-14 02:55:58.398403] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid672775 ] 00:08:03.285 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.545 [2024-07-14 02:55:58.579024] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.545 [2024-07-14 02:55:58.598243] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:03.545 [2024-07-14 02:55:58.598358] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.545 [2024-07-14 02:55:58.649795] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.545 [2024-07-14 02:55:58.666066] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:03.545 INFO: Running with entropic power schedule (0xFF, 100). 00:08:03.545 INFO: Seed: 2979275180 00:08:03.545 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:03.545 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:03.545 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:03.545 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.545 #2 INITED exec/s: 0 rss: 60Mb 00:08:03.545 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
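The summary above closes run 13, and the harness commands that follow stand up fuzzer 14 against trsvcid 4414; the startup output just below registers the entry points TestOneInput and fuzz_admin_set_features_command. A hedged sketch of the general shape of such a libFuzzer target, assuming only that input bytes are copied into the command dwords being mutated; the real llvm_nvme_fuzz.c also brings up and polls an NVMe-oF TCP target, which is elided here.

/* Illustrative libFuzzer entry point; not SPDK's actual harness. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct fuzz_cmd {
    uint8_t  opc;
    uint32_t cdw10;
    uint32_t cdw11;
};

/* Map raw fuzz input onto a SET FEATURES admin command, the pattern the
 * completions below exercise (e.g. cdw10:800000d4). */
static void
fuzz_admin_set_features(struct fuzz_cmd *cmd, const uint8_t *data, size_t size)
{
    cmd->opc = 0x09;  /* SET FEATURES */
    if (size >= 4) {
        memcpy(&cmd->cdw10, data, 4);
    }
    if (size >= 8) {
        memcpy(&cmd->cdw11, data + 4, 4);
    }
}

int
LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct fuzz_cmd cmd = {0};

    fuzz_admin_set_features(&cmd, data, size);
    /* A real harness would submit cmd to the listener on port 4414 and
     * poll for the completion that the qpair printer logs. */
    return 0;
}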
00:08:03.545 This may also happen if the target rejected all inputs we tried so far 00:08:03.545 [2024-07-14 02:55:58.711427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.545 [2024-07-14 02:55:58.711464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.545 [2024-07-14 02:55:58.711522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.545 [2024-07-14 02:55:58.711538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.803 NEW_FUNC[1/671]: 0x4a5880 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:03.803 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.803 #3 NEW cov: 11503 ft: 11504 corp: 2/20b lim: 35 exec/s: 0 rss: 66Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:08:03.803 [2024-07-14 02:55:59.012205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.803 [2024-07-14 02:55:59.012248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.803 [2024-07-14 02:55:59.012308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.803 [2024-07-14 02:55:59.012326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.803 #4 NEW cov: 11616 ft: 12076 corp: 3/39b lim: 35 exec/s: 0 rss: 66Mb L: 19/19 MS: 1 ShuffleBytes- 00:08:04.063 [2024-07-14 02:55:59.062221] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.062250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.062307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.062322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.063 #5 NEW cov: 11622 ft: 12192 corp: 4/56b lim: 35 exec/s: 0 rss: 66Mb L: 17/19 MS: 1 EraseBytes- 00:08:04.063 [2024-07-14 02:55:59.102452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.102478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.102534] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.102548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.102602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.102618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.063 #6 NEW cov: 11714 ft: 12741 corp: 5/79b lim: 35 exec/s: 0 rss: 66Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:04.063 [2024-07-14 02:55:59.142532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.142558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.142597] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.142612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.142665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.142680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.063 #7 NEW cov: 11714 ft: 12819 corp: 6/102b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:04.063 [2024-07-14 02:55:59.182646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.182672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.182726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.182745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.182799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.182814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.063 #8 NEW cov: 11714 ft: 12890 corp: 7/126b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertByte- 00:08:04.063 [2024-07-14 02:55:59.222838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.222865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.222911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.222926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
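Why the SET FEATURES attempts above complete with FEATURE ID NOT SAVEABLE (01/0d): under the NVMe layout of Set Features CDW10 (bits 07:00 feature identifier, bit 31 save), the mutated dword 0x800000d4 asks the target to save a reserved feature id. A small standalone decode, using the value from these log entries:

#include <stdint.h>
#include <stdio.h>

int
main(void)
{
    uint32_t cdw10 = 0x800000d4;           /* value from the log entries */
    uint8_t  fid   = cdw10 & 0xffu;        /* 0xd4: reserved feature id */
    unsigned sv    = (cdw10 >> 31) & 1u;   /* 1: save across power cycles */

    /* Prints fid=0xd4 sv=1, matching the (01/0d) completion status,
     * i.e. status code type 1 (command specific), status code 0x0d. */
    printf("fid=0x%02x sv=%u\n", fid, sv);
    return 0;
}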
00:08:04.063 [2024-07-14 02:55:59.222979] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.222996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.063 #9 NEW cov: 11714 ft: 12970 corp: 8/147b lim: 35 exec/s: 0 rss: 67Mb L: 21/24 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:04.063 [2024-07-14 02:55:59.263084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.263111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.263167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.263182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.263236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.263252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.063 NEW_FUNC[1/2]: 0x4c00f0 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:04.063 NEW_FUNC[2/2]: 0x115f9d0 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1489 00:08:04.063 #10 NEW cov: 11771 ft: 13186 corp: 9/176b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CMP- DE: "\001\004\000\000\000\000\000\000"- 00:08:04.063 [2024-07-14 02:55:59.313078] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.313107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.313162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.313178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.063 [2024-07-14 02:55:59.313231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.063 [2024-07-14 02:55:59.313247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.323 #11 NEW cov: 11771 ft: 13211 corp: 10/197b lim: 35 exec/s: 0 rss: 67Mb L: 21/29 MS: 1 ShuffleBytes- 00:08:04.323 [2024-07-14 02:55:59.353044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.353072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.323 
[2024-07-14 02:55:59.353127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.353143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.323 #12 NEW cov: 11771 ft: 13238 corp: 11/216b lim: 35 exec/s: 0 rss: 67Mb L: 19/29 MS: 1 CopyPart- 00:08:04.323 [2024-07-14 02:55:59.393290] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.393316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.393372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.393388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.393446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.393463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.323 #13 NEW cov: 11771 ft: 13273 corp: 12/237b lim: 35 exec/s: 0 rss: 67Mb L: 21/29 MS: 1 CMP- DE: "\000\000"- 00:08:04.323 [2024-07-14 02:55:59.433388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.433415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.433473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.433489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.433541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.433558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.323 #14 NEW cov: 11771 ft: 13319 corp: 13/258b lim: 35 exec/s: 0 rss: 67Mb L: 21/29 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:04.323 [2024-07-14 02:55:59.473489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.473522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.473579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000002b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.473593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 
02:55:59.473645] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.473661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.323 #15 NEW cov: 11771 ft: 13365 corp: 14/281b lim: 35 exec/s: 0 rss: 67Mb L: 23/29 MS: 1 ChangeBinInt- 00:08:04.323 [2024-07-14 02:55:59.513444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.513470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.513524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.513540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.323 #16 NEW cov: 11771 ft: 13378 corp: 15/300b lim: 35 exec/s: 0 rss: 67Mb L: 19/29 MS: 1 ChangeBit- 00:08:04.323 [2024-07-14 02:55:59.543799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.543825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.543882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.543899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.543952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.543968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.323 [2024-07-14 02:55:59.544023] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.323 [2024-07-14 02:55:59.544038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.323 #17 NEW cov: 11771 ft: 13520 corp: 16/332b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CrossOver- 00:08:04.583 [2024-07-14 02:55:59.583782] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.583807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.583 [2024-07-14 02:55:59.583862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.583877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.583 [2024-07-14 02:55:59.583932] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.583948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.583 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.583 #18 NEW cov: 11794 ft: 13533 corp: 17/355b lim: 35 exec/s: 0 rss: 67Mb L: 23/32 MS: 1 ChangeBinInt- 00:08:04.583 [2024-07-14 02:55:59.623763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.623792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.583 [2024-07-14 02:55:59.623845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.623861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.583 #19 NEW cov: 11794 ft: 13551 corp: 18/372b lim: 35 exec/s: 0 rss: 68Mb L: 17/32 MS: 1 ChangeBinInt- 00:08:04.583 [2024-07-14 02:55:59.664041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.664066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.583 [2024-07-14 02:55:59.664122] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.664137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.583 [2024-07-14 02:55:59.664192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.583 [2024-07-14 02:55:59.664207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.583 #20 NEW cov: 11794 ft: 13591 corp: 19/395b lim: 35 exec/s: 0 rss: 68Mb L: 23/32 MS: 1 ChangeBit- 00:08:04.584 [2024-07-14 02:55:59.704312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.704341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.704400] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.704417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.704483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.704499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.704557] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.704571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.584 #21 NEW cov: 11794 ft: 13608 corp: 20/428b lim: 35 exec/s: 21 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:04.584 [2024-07-14 02:55:59.744482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.744510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.744568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.744584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.744640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.744656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.584 #22 NEW cov: 11794 ft: 13629 corp: 21/457b lim: 35 exec/s: 22 rss: 68Mb L: 29/33 MS: 1 CrossOver- 00:08:04.584 [2024-07-14 02:55:59.784562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.784588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.784649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.784669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.784729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.784747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.784804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.784820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.584 #23 NEW cov: 11794 ft: 13663 corp: 22/488b lim: 35 exec/s: 23 rss: 68Mb L: 31/33 MS: 1 CrossOver- 00:08:04.584 [2024-07-14 02:55:59.824452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.824481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:04.584 [2024-07-14 02:55:59.824539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.824555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.584 [2024-07-14 02:55:59.824610] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.584 [2024-07-14 02:55:59.824626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.843 #24 NEW cov: 11794 ft: 13687 corp: 23/512b lim: 35 exec/s: 24 rss: 68Mb L: 24/33 MS: 1 CrossOver- 00:08:04.843 [2024-07-14 02:55:59.864452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.864479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 02:55:59.864536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.864553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.843 #25 NEW cov: 11794 ft: 13720 corp: 24/531b lim: 35 exec/s: 25 rss: 68Mb L: 19/33 MS: 1 ChangeBinInt- 00:08:04.843 [2024-07-14 02:55:59.904879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.904907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 02:55:59.904966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.904980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 02:55:59.905035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.905050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 02:55:59.905106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.905120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.843 #26 NEW cov: 11794 ft: 13740 corp: 25/564b lim: 35 exec/s: 26 rss: 68Mb L: 33/33 MS: 1 CrossOver- 00:08:04.843 [2024-07-14 02:55:59.944724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.944755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 
02:55:59.944812] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.944828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.843 #27 NEW cov: 11794 ft: 13757 corp: 26/581b lim: 35 exec/s: 27 rss: 68Mb L: 17/33 MS: 1 ShuffleBytes- 00:08:04.843 [2024-07-14 02:55:59.975084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.975109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 02:55:59.975169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000013 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.975184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.843 [2024-07-14 02:55:59.975240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.843 [2024-07-14 02:55:59.975256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.844 [2024-07-14 02:55:59.975315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:55:59.975331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.844 #28 NEW cov: 11794 ft: 13763 corp: 27/613b lim: 35 exec/s: 28 rss: 68Mb L: 32/33 MS: 1 CrossOver- 00:08:04.844 [2024-07-14 02:56:00.015244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:56:00.015274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.844 [2024-07-14 02:56:00.015329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:56:00.015345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.844 [2024-07-14 02:56:00.015404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:56:00.015422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.844 #29 NEW cov: 11794 ft: 13782 corp: 28/642b lim: 35 exec/s: 29 rss: 68Mb L: 29/33 MS: 1 ChangeByte- 00:08:04.844 [2024-07-14 02:56:00.055388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:56:00.055415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.844 [2024-07-14 02:56:00.055480] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:56:00.055496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.844 [2024-07-14 02:56:00.055554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.844 [2024-07-14 02:56:00.055573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.844 #30 NEW cov: 11794 ft: 13793 corp: 29/671b lim: 35 exec/s: 30 rss: 68Mb L: 29/33 MS: 1 PersAutoDict- DE: "\001\004\000\000\000\000\000\000"- 00:08:05.104 [2024-07-14 02:56:00.095287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.095316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.095372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.095390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.095448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.095464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 #31 NEW cov: 11794 ft: 13797 corp: 30/694b lim: 35 exec/s: 31 rss: 68Mb L: 23/33 MS: 1 ChangeBinInt- 00:08:05.104 [2024-07-14 02:56:00.125526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.125553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.125610] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.125627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.125687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.125705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.125758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.125786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.104 #32 NEW cov: 11794 ft: 13827 corp: 31/725b lim: 35 exec/s: 32 rss: 68Mb L: 31/33 MS: 1 CrossOver- 00:08:05.104 [2024-07-14 02:56:00.165479] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.165504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.165563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.165579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.165634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.165650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 #33 NEW cov: 11794 ft: 13831 corp: 32/748b lim: 35 exec/s: 33 rss: 68Mb L: 23/33 MS: 1 ChangeByte- 00:08:05.104 [2024-07-14 02:56:00.195768] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT VECTOR CONFIGURATION cid:5 cdw10:00000009 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.195798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.195857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.195875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.195930] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.195946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.104 NEW_FUNC[1/1]: 0x4c5fa0 in feat_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:332 00:08:05.104 #34 NEW cov: 11817 ft: 13895 corp: 33/778b lim: 35 exec/s: 34 rss: 69Mb L: 30/33 MS: 1 InsertByte- 00:08:05.104 [2024-07-14 02:56:00.235701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.235729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.235785] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.235800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.235844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.235861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 #35 NEW 
cov: 11817 ft: 13907 corp: 34/801b lim: 35 exec/s: 35 rss: 69Mb L: 23/33 MS: 1 ChangeByte- 00:08:05.104 [2024-07-14 02:56:00.275805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.275832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.275884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.275900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.275944] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.275959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 #36 NEW cov: 11817 ft: 13965 corp: 35/825b lim: 35 exec/s: 36 rss: 69Mb L: 24/33 MS: 1 ChangeBit- 00:08:05.104 [2024-07-14 02:56:00.315952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.315977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.316032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.316047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.104 [2024-07-14 02:56:00.316096] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.104 [2024-07-14 02:56:00.316114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.104 #37 NEW cov: 11817 ft: 13992 corp: 36/852b lim: 35 exec/s: 37 rss: 69Mb L: 27/33 MS: 1 CopyPart- 00:08:05.364 [2024-07-14 02:56:00.355931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.355971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.356026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.356042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.364 #38 NEW cov: 11817 ft: 14050 corp: 37/871b lim: 35 exec/s: 38 rss: 69Mb L: 19/33 MS: 1 ChangeByte- 00:08:05.364 [2024-07-14 02:56:00.396218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.396246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE 
(01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.396299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.396314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.396366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.396381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.364 #39 NEW cov: 11817 ft: 14063 corp: 38/893b lim: 35 exec/s: 39 rss: 69Mb L: 22/33 MS: 1 InsertByte- 00:08:05.364 [2024-07-14 02:56:00.435948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.435975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.364 #40 NEW cov: 11817 ft: 14854 corp: 39/902b lim: 35 exec/s: 40 rss: 69Mb L: 9/33 MS: 1 EraseBytes- 00:08:05.364 [2024-07-14 02:56:00.476247] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.476275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.476329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.476344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.364 #41 NEW cov: 11817 ft: 14907 corp: 40/919b lim: 35 exec/s: 41 rss: 69Mb L: 17/33 MS: 1 ChangeBinInt- 00:08:05.364 [2024-07-14 02:56:00.516676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.516701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.516759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.516775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.516830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.516849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.364 #42 NEW cov: 11817 ft: 14922 corp: 41/950b lim: 35 exec/s: 42 rss: 69Mb L: 31/33 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:05.364 [2024-07-14 02:56:00.556809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 
02:56:00.556835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.556893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.556911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.556966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000013 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.556981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.557037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.557054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.364 #43 NEW cov: 11817 ft: 14925 corp: 42/983b lim: 35 exec/s: 43 rss: 69Mb L: 33/33 MS: 1 CopyPart- 00:08:05.364 [2024-07-14 02:56:00.596928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.596957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.597017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.597035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.364 [2024-07-14 02:56:00.597091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.364 [2024-07-14 02:56:00.597106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.624 #44 NEW cov: 11817 ft: 14957 corp: 43/1012b lim: 35 exec/s: 44 rss: 69Mb L: 29/33 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:05.624 [2024-07-14 02:56:00.636874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.636900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.624 [2024-07-14 02:56:00.636956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.636972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.624 [2024-07-14 02:56:00.637027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.637043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:05.624 #45 NEW cov: 11817 ft: 14966 corp: 44/1039b lim: 35 exec/s: 45 rss: 69Mb L: 27/33 MS: 1 ChangeByte- 00:08:05.624 [2024-07-14 02:56:00.677094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.677122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.624 [2024-07-14 02:56:00.677183] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.677198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.624 [2024-07-14 02:56:00.677254] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.677270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.624 [2024-07-14 02:56:00.677327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.624 [2024-07-14 02:56:00.677342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.624 #46 NEW cov: 11817 ft: 14969 corp: 45/1072b lim: 35 exec/s: 23 rss: 70Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:05.624 #46 DONE cov: 11817 ft: 14969 corp: 45/1072b lim: 35 exec/s: 23 rss: 70Mb 00:08:05.624 ###### Recommended dictionary. ###### 00:08:05.624 "\377\377\377\377" # Uses: 1 00:08:05.624 "\001\004\000\000\000\000\000\000" # Uses: 1 00:08:05.624 "\000\000" # Uses: 2 00:08:05.624 ###### End of recommended dictionary. 
###### 00:08:05.624 Done 46 runs in 2 second(s) 00:08:05.624 02:56:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:05.624 02:56:00 -- ../common.sh@72 -- # (( i++ )) 00:08:05.624 02:56:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.624 02:56:00 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:05.624 02:56:00 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:05.624 02:56:00 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.624 02:56:00 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.624 02:56:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:05.624 02:56:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:05.624 02:56:00 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:05.624 02:56:00 -- nvmf/run.sh@29 -- # port=4415 00:08:05.624 02:56:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:05.624 02:56:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:05.624 02:56:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.624 02:56:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:05.624 [2024-07-14 02:56:00.857227] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:05.624 [2024-07-14 02:56:00.857299] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid673314 ] 00:08:05.884 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.884 [2024-07-14 02:56:01.032013] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.884 [2024-07-14 02:56:01.051675] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.884 [2024-07-14 02:56:01.051796] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.884 [2024-07-14 02:56:01.103159] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.884 [2024-07-14 02:56:01.119462] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:05.884 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.884 INFO: Seed: 1139325327 00:08:06.143 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:06.143 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:06.143 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:06.143 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.143 #2 INITED exec/s: 0 rss: 60Mb 00:08:06.143 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
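The traced nvmf/run.sh steps above show the per-instance setup for fuzzer 15: the listener port is derived from the fuzzer index (printf %02d 15, giving trsvcid 4415), the stock fuzz_json.conf has its default trsvcid rewritten by sed, and llvm_nvme_fuzz is launched against the resulting TCP TRID. The following is a minimal shell sketch reconstructing that sequence from the trace; SPDK_DIR, FUZZER_ID and PORT are illustrative placeholder names not taken from the log, and only flags visible in the traced command line are used.

#!/bin/sh
# Sketch of the traced setup for one fuzzer instance (variable names are placeholders).
SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
FUZZER_ID=15

# The port is "44" plus the zero-padded fuzzer index: printf %02d 15 -> "15" -> trsvcid 4415.
PORT="44$(printf %02d "$FUZZER_ID")"
TRID="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT"

# Rewrite the default trsvcid (4420) in the stock JSON config for this instance.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$FUZZER_ID.conf"

# Run the LLVM NVMe fuzzer for 1 second (-t 1) against the rewritten target,
# using the per-fuzzer corpus directory (-D), fuzzer id (-Z), and RPC socket (-r).
mkdir -p "$SPDK_DIR/../corpus/llvm_nvmf_$FUZZER_ID"
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -F "$TRID" -c "/tmp/fuzz_json_$FUZZER_ID.conf" -t 1 \
    -D "$SPDK_DIR/../corpus/llvm_nvmf_$FUZZER_ID" -Z "$FUZZER_ID" \
    -r "/var/tmp/spdk$FUZZER_ID.sock"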
00:08:06.143 This may also happen if the target rejected all inputs we tried so far 00:08:06.143 [2024-07-14 02:56:01.175114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.143 [2024-07-14 02:56:01.175144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.143 [2024-07-14 02:56:01.175204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.143 [2024-07-14 02:56:01.175219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.143 [2024-07-14 02:56:01.175275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.143 [2024-07-14 02:56:01.175290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.402 NEW_FUNC[1/671]: 0x4a6dc0 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:06.402 NEW_FUNC[2/671]: 0x4c6c20 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:06.402 #8 NEW cov: 11505 ft: 11506 corp: 2/34b lim: 35 exec/s: 0 rss: 66Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:06.402 [2024-07-14 02:56:01.485884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.485916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.485974] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.485988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.486044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.486059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.402 #19 NEW cov: 11618 ft: 12072 corp: 3/68b lim: 35 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 InsertByte- 00:08:06.402 [2024-07-14 02:56:01.536067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.536094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.536152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.536166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.536224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.536238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.536295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.536308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.402 #25 NEW cov: 11624 ft: 12364 corp: 4/103b lim: 35 exec/s: 0 rss: 66Mb L: 35/35 MS: 1 CopyPart- 00:08:06.402 [2024-07-14 02:56:01.576000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.576026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.576086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.576101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.576158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.402 [2024-07-14 02:56:01.576172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.402 [2024-07-14 02:56:01.576232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.403 [2024-07-14 02:56:01.576246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.403 #28 NEW cov: 11709 ft: 12620 corp: 5/133b lim: 35 exec/s: 0 rss: 66Mb L: 30/35 MS: 3 ChangeByte-CrossOver-CrossOver- 00:08:06.403 [2024-07-14 02:56:01.616335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000023d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.403 [2024-07-14 02:56:01.616361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.403 [2024-07-14 02:56:01.616421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.403 [2024-07-14 02:56:01.616435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.403 [2024-07-14 02:56:01.616502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.403 [2024-07-14 02:56:01.616515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.403 [2024-07-14 02:56:01.616572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.403 [2024-07-14 02:56:01.616587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 
cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.403 #29 NEW cov: 11709 ft: 12731 corp: 6/168b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeByte- 00:08:06.662 [2024-07-14 02:56:01.656465] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000023d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.656493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.656554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.656569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.656628] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.656642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.656699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.656718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.662 #30 NEW cov: 11709 ft: 12793 corp: 7/203b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:06.662 [2024-07-14 02:56:01.696445] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.696470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.696532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.696546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.696603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.696618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.662 #31 NEW cov: 11709 ft: 12867 corp: 8/236b lim: 35 exec/s: 0 rss: 67Mb L: 33/35 MS: 1 ChangeBit- 00:08:06.662 [2024-07-14 02:56:01.736510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.736537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.736595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.736609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.736664] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.736679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.662 #32 NEW cov: 11709 ft: 12956 corp: 9/269b lim: 35 exec/s: 0 rss: 67Mb L: 33/35 MS: 1 ChangeBit- 00:08:06.662 [2024-07-14 02:56:01.776622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.776649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.776708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.776722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.776776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.776790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.662 #33 NEW cov: 11709 ft: 13013 corp: 10/303b lim: 35 exec/s: 0 rss: 67Mb L: 34/35 MS: 1 CopyPart- 00:08:06.662 [2024-07-14 02:56:01.816571] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.816597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.816658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.816673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.662 #39 NEW cov: 11709 ft: 13485 corp: 11/324b lim: 35 exec/s: 0 rss: 67Mb L: 21/35 MS: 1 EraseBytes- 00:08:06.662 [2024-07-14 02:56:01.856729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.856755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.662 [2024-07-14 02:56:01.856814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.662 [2024-07-14 02:56:01.856828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.662 #40 NEW cov: 11709 ft: 13574 corp: 12/349b lim: 35 exec/s: 0 rss: 67Mb L: 25/35 MS: 1 EraseBytes- 00:08:06.663 [2024-07-14 02:56:01.896827] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.663 [2024-07-14 02:56:01.896853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-07-14 02:56:01.896915] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.663 [2024-07-14 02:56:01.896930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 #42 NEW cov: 11709 ft: 13601 corp: 13/371b lim: 35 exec/s: 0 rss: 67Mb L: 22/35 MS: 2 ShuffleBytes-CrossOver- 00:08:06.922 [2024-07-14 02:56:01.937232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.922 [2024-07-14 02:56:01.937259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-07-14 02:56:01.937319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.922 [2024-07-14 02:56:01.937333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 [2024-07-14 02:56:01.937393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.922 [2024-07-14 02:56:01.937408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.922 [2024-07-14 02:56:01.937472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.922 [2024-07-14 02:56:01.937486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.922 #43 NEW cov: 11709 ft: 13617 corp: 14/406b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CopyPart- 00:08:06.922 [2024-07-14 02:56:01.977330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.922 [2024-07-14 02:56:01.977357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-07-14 02:56:01.977417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.922 [2024-07-14 02:56:01.977431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:01.977493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:01.977508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.923 #44 NEW cov: 11709 ft: 13662 corp: 15/439b lim: 35 exec/s: 0 rss: 67Mb L: 33/35 MS: 1 ChangeByte- 00:08:06.923 [2024-07-14 02:56:02.017457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.017487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.017548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 
cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.017562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.017623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.017636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.923 #45 NEW cov: 11709 ft: 13685 corp: 16/469b lim: 35 exec/s: 0 rss: 67Mb L: 30/35 MS: 1 CMP- DE: "\2572\266\267\240\355)\000"- 00:08:06.923 [2024-07-14 02:56:02.057716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.057743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.057804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.057819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.057879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.057894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.057950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.057965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.923 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.923 #46 NEW cov: 11732 ft: 13735 corp: 17/504b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 PersAutoDict- DE: "\2572\266\267\240\355)\000"- 00:08:06.923 [2024-07-14 02:56:02.097685] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.097712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.097773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.097787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.097844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.097858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.923 #47 NEW cov: 11732 ft: 13757 corp: 18/537b lim: 35 exec/s: 0 rss: 68Mb L: 33/35 MS: 1 CMP- DE: "\001\014"- 00:08:06.923 [2024-07-14 
02:56:02.137701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.137727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.923 [2024-07-14 02:56:02.137787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.923 [2024-07-14 02:56:02.137804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.923 #48 NEW cov: 11732 ft: 13806 corp: 19/563b lim: 35 exec/s: 48 rss: 68Mb L: 26/35 MS: 1 InsertByte- 00:08:07.184 [2024-07-14 02:56:02.177972] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000023d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.177999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.178061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.178077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.178138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.178152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.184 #54 NEW cov: 11732 ft: 13815 corp: 20/596b lim: 35 exec/s: 54 rss: 68Mb L: 33/35 MS: 1 ChangeBinInt- 00:08:07.184 [2024-07-14 02:56:02.218096] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.218123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.218182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.218196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.218252] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.218266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.184 NEW_FUNC[1/3]: 0x4c23c0 in feat_temperature_threshold /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:295 00:08:07.184 NEW_FUNC[2/3]: 0x1155200 in nvmf_ctrlr_get_features_temperature_threshold /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1572 00:08:07.184 #55 NEW cov: 11789 ft: 13881 corp: 21/630b lim: 35 exec/s: 55 rss: 68Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:07.184 [2024-07-14 02:56:02.258272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.258299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.258358] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.258373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.258434] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.258452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.258509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.258523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.184 #56 NEW cov: 11789 ft: 13886 corp: 22/665b lim: 35 exec/s: 56 rss: 68Mb L: 35/35 MS: 1 PersAutoDict- DE: "\001\014"- 00:08:07.184 [2024-07-14 02:56:02.298293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.298319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.298378] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.298393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.298453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.298468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.184 #57 NEW cov: 11789 ft: 13933 corp: 23/695b lim: 35 exec/s: 57 rss: 68Mb L: 30/35 MS: 1 ShuffleBytes- 00:08:07.184 [2024-07-14 02:56:02.338283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.338309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.338365] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.338380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.184 #58 NEW cov: 11789 ft: 13960 corp: 24/722b lim: 35 exec/s: 58 rss: 68Mb L: 27/35 MS: 1 CopyPart- 00:08:07.184 [2024-07-14 02:56:02.378513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 
[2024-07-14 02:56:02.378539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.184 [2024-07-14 02:56:02.378596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.184 [2024-07-14 02:56:02.378610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.185 [2024-07-14 02:56:02.378664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-14 02:56:02.378678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.185 #59 NEW cov: 11789 ft: 13964 corp: 25/755b lim: 35 exec/s: 59 rss: 68Mb L: 33/35 MS: 1 ChangeBinInt- 00:08:07.185 [2024-07-14 02:56:02.418759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-14 02:56:02.418785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.185 [2024-07-14 02:56:02.418846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-14 02:56:02.418860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.185 [2024-07-14 02:56:02.418920] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-14 02:56:02.418934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.185 [2024-07-14 02:56:02.418995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-14 02:56:02.419009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.444 #60 NEW cov: 11789 ft: 14017 corp: 26/790b lim: 35 exec/s: 60 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:08:07.444 [2024-07-14 02:56:02.458765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.444 [2024-07-14 02:56:02.458791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.444 [2024-07-14 02:56:02.458852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.444 [2024-07-14 02:56:02.458866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.444 [2024-07-14 02:56:02.458922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.444 [2024-07-14 02:56:02.458936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 #61 NEW cov: 11789 ft: 14033 corp: 27/822b 
lim: 35 exec/s: 61 rss: 68Mb L: 32/35 MS: 1 EraseBytes- 00:08:07.445 [2024-07-14 02:56:02.498734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.498759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.498817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.498832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 #62 NEW cov: 11789 ft: 14039 corp: 28/844b lim: 35 exec/s: 62 rss: 68Mb L: 22/35 MS: 1 ChangeBinInt- 00:08:07.445 [2024-07-14 02:56:02.539004] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.539030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.539092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.539107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.539164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.539178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 #63 NEW cov: 11789 ft: 14051 corp: 29/878b lim: 35 exec/s: 63 rss: 68Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:07.445 [2024-07-14 02:56:02.579224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.579251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.579313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.579328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.579387] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.579401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.579457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.579474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.445 #64 NEW cov: 11789 ft: 14054 corp: 30/913b lim: 35 exec/s: 64 rss: 68Mb L: 35/35 MS: 1 ChangeByte- 00:08:07.445 [2024-07-14 
02:56:02.619216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000023d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.619242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.619300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.619313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.619373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.619388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 #65 NEW cov: 11789 ft: 14062 corp: 31/946b lim: 35 exec/s: 65 rss: 68Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:07.445 [2024-07-14 02:56:02.659324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.659350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.659411] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.659425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-14 02:56:02.659486] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-14 02:56:02.659500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 #66 NEW cov: 11789 ft: 14069 corp: 32/980b lim: 35 exec/s: 66 rss: 69Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:07.704 [2024-07-14 02:56:02.699492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.704 [2024-07-14 02:56:02.699518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.699581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.699595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.699655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.699669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.705 #67 NEW cov: 11789 ft: 14079 corp: 33/1013b lim: 35 exec/s: 67 rss: 69Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:07.705 [2024-07-14 02:56:02.739523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.739549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.739609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.739622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.739687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.739701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.705 #68 NEW cov: 11789 ft: 14085 corp: 34/1046b lim: 35 exec/s: 68 rss: 69Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:07.705 [2024-07-14 02:56:02.769508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.769533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.769591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.769605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.705 #69 NEW cov: 11789 ft: 14086 corp: 35/1067b lim: 35 exec/s: 69 rss: 69Mb L: 21/35 MS: 1 ChangeBit- 00:08:07.705 [2024-07-14 02:56:02.809929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000023d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.809954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.810013] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.810028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.810067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.810081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.810139] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.810153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.705 #70 NEW cov: 11789 ft: 14111 corp: 36/1102b lim: 35 exec/s: 70 rss: 69Mb L: 35/35 MS: 1 ChangeBit- 00:08:07.705 [2024-07-14 02:56:02.849629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 
[2024-07-14 02:56:02.849655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 #71 NEW cov: 11789 ft: 14402 corp: 37/1121b lim: 35 exec/s: 71 rss: 69Mb L: 19/35 MS: 1 EraseBytes- 00:08:07.705 [2024-07-14 02:56:02.890029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.890054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.890112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.890127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.890184] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.890198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.705 #72 NEW cov: 11789 ft: 14456 corp: 38/1151b lim: 35 exec/s: 72 rss: 69Mb L: 30/35 MS: 1 EraseBytes- 00:08:07.705 [2024-07-14 02:56:02.930129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.930155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.930216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.930230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.705 [2024-07-14 02:56:02.930290] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.705 [2024-07-14 02:56:02.930304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.705 #73 NEW cov: 11789 ft: 14457 corp: 39/1182b lim: 35 exec/s: 73 rss: 69Mb L: 31/35 MS: 1 InsertByte- 00:08:07.965 [2024-07-14 02:56:02.970176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:02.970202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:02.970262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:02.970277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:02.970336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:02.970351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:02.970409] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:02.970424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.965 #74 NEW cov: 11789 ft: 14522 corp: 40/1212b lim: 35 exec/s: 74 rss: 69Mb L: 30/35 MS: 1 ChangeBit- 00:08:07.965 [2024-07-14 02:56:03.010502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000023d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.010527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.010587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000012a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.010601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.010659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.010674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.010731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.010745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.965 #75 NEW cov: 11789 ft: 14562 corp: 41/1247b lim: 35 exec/s: 75 rss: 69Mb L: 35/35 MS: 1 ChangeByte- 00:08:07.965 [2024-07-14 02:56:03.050228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.050256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.050316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.050331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.050389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.050403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.965 #76 NEW cov: 11789 ft: 14568 corp: 42/1271b lim: 35 exec/s: 76 rss: 69Mb L: 24/35 MS: 1 EraseBytes- 00:08:07.965 [2024-07-14 02:56:03.090683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.090711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 
02:56:03.090770] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.090787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.090848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.090862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.965 [2024-07-14 02:56:03.090921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.090935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.965 #77 NEW cov: 11789 ft: 14573 corp: 43/1306b lim: 35 exec/s: 77 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:08:07.965 [2024-07-14 02:56:03.130379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000013d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.965 [2024-07-14 02:56:03.130405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.965 #78 NEW cov: 11789 ft: 14619 corp: 44/1325b lim: 35 exec/s: 39 rss: 69Mb L: 19/35 MS: 1 ShuffleBytes- 00:08:07.965 #78 DONE cov: 11789 ft: 14619 corp: 44/1325b lim: 35 exec/s: 39 rss: 69Mb 00:08:07.965 ###### Recommended dictionary. ###### 00:08:07.965 "\2572\266\267\240\355)\000" # Uses: 1 00:08:07.965 "\001\014" # Uses: 1 00:08:07.965 ###### End of recommended dictionary. 
###### 00:08:07.965 Done 78 runs in 2 second(s) 00:08:08.245 02:56:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:08.245 02:56:03 -- ../common.sh@72 -- # (( i++ )) 00:08:08.245 02:56:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.245 02:56:03 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:08.245 02:56:03 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:08.245 02:56:03 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.245 02:56:03 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.245 02:56:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:08.245 02:56:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:08.245 02:56:03 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:08.245 02:56:03 -- nvmf/run.sh@29 -- # port=4416 00:08:08.245 02:56:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:08.245 02:56:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:08.245 02:56:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.245 02:56:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:08.245 [2024-07-14 02:56:03.311102] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:08.245 [2024-07-14 02:56:03.311171] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid673619 ] 00:08:08.245 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.245 [2024-07-14 02:56:03.489241] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.503 [2024-07-14 02:56:03.508645] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.503 [2024-07-14 02:56:03.508760] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.503 [2024-07-14 02:56:03.560179] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.503 [2024-07-14 02:56:03.576485] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:08.503 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.503 INFO: Seed: 3595314147 00:08:08.503 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:08.503 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:08.503 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:08.503 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.503 #2 INITED exec/s: 0 rss: 59Mb 00:08:08.503 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.503 This may also happen if the target rejected all inputs we tried so far 00:08:08.503 [2024-07-14 02:56:03.652204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.503 [2024-07-14 02:56:03.652248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.762 NEW_FUNC[1/671]: 0x4a8270 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:08.762 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.762 #38 NEW cov: 11594 ft: 11595 corp: 2/25b lim: 105 exec/s: 0 rss: 65Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:08.762 [2024-07-14 02:56:03.983098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65284 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.762 [2024-07-14 02:56:03.983149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.762 #39 NEW cov: 11707 ft: 12208 corp: 3/49b lim: 105 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:09.020 [2024-07-14 02:56:04.033081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057594021150720 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.020 [2024-07-14 02:56:04.033114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.020 #40 NEW cov: 11713 ft: 12499 corp: 4/77b lim: 105 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:09.020 [2024-07-14 02:56:04.073112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.020 [2024-07-14 02:56:04.073140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.020 #41 NEW cov: 11798 ft: 12727 corp: 5/101b lim: 105 exec/s: 0 rss: 67Mb L: 24/28 MS: 1 ChangeBinInt- 00:08:09.020 [2024-07-14 02:56:04.123452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18723454644125696 len:60714 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.020 [2024-07-14 02:56:04.123481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.020 #42 NEW cov: 11798 ft: 12806 corp: 6/133b lim: 105 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CMP- DE: "B\204\341{\247\355)\000"- 00:08:09.020 [2024-07-14 02:56:04.163469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65284 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.020 [2024-07-14 02:56:04.163498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.020 #43 NEW cov: 11798 ft: 12849 corp: 7/157b lim: 105 exec/s: 0 rss: 67Mb L: 24/32 MS: 1 ChangeBit- 00:08:09.020 [2024-07-14 02:56:04.203633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:216172786392498176 len:1 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:09.020 [2024-07-14 02:56:04.203659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.020 #44 NEW cov: 11798 ft: 12914 corp: 8/189b lim: 105 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:08:09.020 [2024-07-14 02:56:04.243766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.020 [2024-07-14 02:56:04.243795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.020 #50 NEW cov: 11798 ft: 12945 corp: 9/221b lim: 105 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:08:09.278 [2024-07-14 02:56:04.283779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073707913215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.283805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.278 #56 NEW cov: 11798 ft: 12964 corp: 10/246b lim: 105 exec/s: 0 rss: 67Mb L: 25/32 MS: 1 InsertByte- 00:08:09.278 [2024-07-14 02:56:04.324102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65284 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.324131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.278 [2024-07-14 02:56:04.324254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.324276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.278 #57 NEW cov: 11798 ft: 13440 corp: 11/291b lim: 105 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:09.278 [2024-07-14 02:56:04.374018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.374043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.278 #59 NEW cov: 11798 ft: 13534 corp: 12/318b lim: 105 exec/s: 0 rss: 67Mb L: 27/45 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:09.278 [2024-07-14 02:56:04.414284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.414309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.278 #60 NEW cov: 11798 ft: 13572 corp: 13/350b lim: 105 exec/s: 0 rss: 68Mb L: 32/45 MS: 1 ChangeByte- 00:08:09.278 [2024-07-14 02:56:04.454455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.454481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:09.278 #61 NEW cov: 11798 ft: 13586 corp: 14/375b lim: 105 exec/s: 0 rss: 68Mb L: 25/45 MS: 1 InsertByte- 00:08:09.278 [2024-07-14 02:56:04.494434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18375530908895084543 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.278 [2024-07-14 02:56:04.494473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.278 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.278 #62 NEW cov: 11821 ft: 13662 corp: 15/408b lim: 105 exec/s: 0 rss: 68Mb L: 33/45 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:08:09.536 [2024-07-14 02:56:04.534477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65284 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.534502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 #63 NEW cov: 11821 ft: 13708 corp: 16/432b lim: 105 exec/s: 0 rss: 68Mb L: 24/45 MS: 1 ChangeBit- 00:08:09.536 [2024-07-14 02:56:04.575324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2459565880200659114 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.575354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.575455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.575475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.575588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.575606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.575725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.575746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.536 #67 NEW cov: 11821 ft: 14261 corp: 17/522b lim: 105 exec/s: 0 rss: 68Mb L: 90/90 MS: 4 EraseBytes-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:09.536 [2024-07-14 02:56:04.615370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2459565880200659114 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.615400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.615487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.615506] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.615630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.615653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.615777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.615804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.536 #68 NEW cov: 11821 ft: 14360 corp: 18/612b lim: 105 exec/s: 68 rss: 68Mb L: 90/90 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:08:09.536 [2024-07-14 02:56:04.665499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.665530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.665651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.665675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.665797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.665821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.665948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.665973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.536 #69 NEW cov: 11821 ft: 14368 corp: 19/704b lim: 105 exec/s: 69 rss: 68Mb L: 92/92 MS: 1 InsertRepeatedBytes- 00:08:09.536 [2024-07-14 02:56:04.705090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.705114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 #70 NEW cov: 11821 ft: 14439 corp: 20/730b lim: 105 exec/s: 70 rss: 68Mb L: 26/92 MS: 1 InsertByte- 00:08:09.536 [2024-07-14 02:56:04.745183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:510 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.745210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 #71 NEW cov: 11821 ft: 14448 corp: 
21/754b lim: 105 exec/s: 71 rss: 68Mb L: 24/92 MS: 1 ChangeBinInt- 00:08:09.536 [2024-07-14 02:56:04.785728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.785760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.785891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.785917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.536 [2024-07-14 02:56:04.786051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15553137160860205055 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-07-14 02:56:04.786074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.794 #72 NEW cov: 11821 ft: 14728 corp: 22/821b lim: 105 exec/s: 72 rss: 68Mb L: 67/92 MS: 1 CrossOver- 00:08:09.794 [2024-07-14 02:56:04.835504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714388 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:04.835531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.794 #73 NEW cov: 11821 ft: 14737 corp: 23/846b lim: 105 exec/s: 73 rss: 69Mb L: 25/92 MS: 1 InsertByte- 00:08:09.794 [2024-07-14 02:56:04.875774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:04.875805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.794 [2024-07-14 02:56:04.875936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15553137160186484695 len:55256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:04.875962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.794 #74 NEW cov: 11821 ft: 14790 corp: 24/894b lim: 105 exec/s: 74 rss: 69Mb L: 48/92 MS: 1 EraseBytes- 00:08:09.794 [2024-07-14 02:56:04.925676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12100372854907134331 len:65284 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:04.925707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.794 #75 NEW cov: 11821 ft: 14799 corp: 25/918b lim: 105 exec/s: 75 rss: 69Mb L: 24/92 MS: 1 PersAutoDict- DE: "B\204\341{\247\355)\000"- 00:08:09.794 [2024-07-14 02:56:04.965844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:04.965871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.794 #76 NEW 
cov: 11821 ft: 14810 corp: 26/943b lim: 105 exec/s: 76 rss: 69Mb L: 25/92 MS: 1 InsertByte- 00:08:09.794 [2024-07-14 02:56:05.005945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:510 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:05.005972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.794 #77 NEW cov: 11821 ft: 14815 corp: 27/967b lim: 105 exec/s: 77 rss: 69Mb L: 24/92 MS: 1 CrossOver- 00:08:09.794 [2024-07-14 02:56:05.046156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.794 [2024-07-14 02:56:05.046187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.052 #78 NEW cov: 11821 ft: 14849 corp: 28/999b lim: 105 exec/s: 78 rss: 69Mb L: 32/92 MS: 1 ChangeBinInt- 00:08:10.052 [2024-07-14 02:56:05.086205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:65520 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.086235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.052 #79 NEW cov: 11821 ft: 14856 corp: 29/1024b lim: 105 exec/s: 79 rss: 69Mb L: 25/92 MS: 1 ChangeBit- 00:08:10.052 [2024-07-14 02:56:05.126992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2459565880200659114 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.127023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.052 [2024-07-14 02:56:05.127117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.127137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.052 [2024-07-14 02:56:05.127285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.127315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.052 [2024-07-14 02:56:05.127450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.127473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.052 #80 NEW cov: 11821 ft: 14882 corp: 30/1114b lim: 105 exec/s: 80 rss: 69Mb L: 90/92 MS: 1 ShuffleBytes- 00:08:10.052 [2024-07-14 02:56:05.166122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18014402788196352 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.166150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.052 #81 NEW cov: 11821 ft: 
14888 corp: 31/1138b lim: 105 exec/s: 81 rss: 69Mb L: 24/92 MS: 1 ChangeBit- 00:08:10.052 [2024-07-14 02:56:05.206355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446492285546790911 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.206386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.052 #82 NEW cov: 11821 ft: 14931 corp: 32/1164b lim: 105 exec/s: 82 rss: 69Mb L: 26/92 MS: 1 ChangeBinInt- 00:08:10.052 [2024-07-14 02:56:05.256796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446535737730924543 len:42990 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.052 [2024-07-14 02:56:05.256821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.052 #83 NEW cov: 11821 ft: 14951 corp: 33/1196b lim: 105 exec/s: 83 rss: 69Mb L: 32/92 MS: 1 PersAutoDict- DE: "B\204\341{\247\355)\000"- 00:08:10.311 [2024-07-14 02:56:05.306990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:216172786392498176 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.307020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.311 #84 NEW cov: 11821 ft: 14960 corp: 34/1228b lim: 105 exec/s: 84 rss: 69Mb L: 32/92 MS: 1 PersAutoDict- DE: "B\204\341{\247\355)\000"- 00:08:10.311 [2024-07-14 02:56:05.347637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2459565880200659114 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.347667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.347781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.347806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.347932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.347957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.348088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.348113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.311 #85 NEW cov: 11821 ft: 14966 corp: 35/1318b lim: 105 exec/s: 85 rss: 70Mb L: 90/92 MS: 1 ShuffleBytes- 00:08:10.311 [2024-07-14 02:56:05.397065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18375530908895084543 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.397094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.311 #86 NEW cov: 11821 ft: 14990 corp: 36/1351b lim: 105 exec/s: 86 rss: 70Mb L: 33/92 MS: 1 CMP- DE: "\310\324\022\255\242\355)\000"- 00:08:10.311 [2024-07-14 02:56:05.436908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.436936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.311 #87 NEW cov: 11821 ft: 14997 corp: 37/1376b lim: 105 exec/s: 87 rss: 70Mb L: 25/92 MS: 1 CMP- DE: "\001)\355\242\264S\251\222"- 00:08:10.311 [2024-07-14 02:56:05.477251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65284 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.477281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.477417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.477440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.311 #88 NEW cov: 11821 ft: 15005 corp: 38/1421b lim: 105 exec/s: 88 rss: 70Mb L: 45/92 MS: 1 CrossOver- 00:08:10.311 [2024-07-14 02:56:05.528201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18417638459029061631 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.528231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.528323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10995706271387654296 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.528343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.528489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:10995706271387654296 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.528511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.311 [2024-07-14 02:56:05.528648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11024811886068144280 len:510 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.311 [2024-07-14 02:56:05.528667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.311 #89 NEW cov: 11821 ft: 15067 corp: 39/1508b lim: 105 exec/s: 89 rss: 70Mb L: 87/92 MS: 1 InsertRepeatedBytes- 00:08:10.570 [2024-07-14 02:56:05.577784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446584369643978751 len:13952 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.570 [2024-07-14 02:56:05.577809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.570 #90 NEW cov: 
11821 ft: 15082 corp: 40/1533b lim: 105 exec/s: 90 rss: 70Mb L: 25/92 MS: 1 CMP- DE: "n\277\024\2106\177\000\000"-
00:08:10.570 [2024-07-14 02:56:05.617886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4278714368 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:10.570 [2024-07-14 02:56:05.617915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:10.570 #91 NEW cov: 11821 ft: 15090 corp: 41/1558b lim: 105 exec/s: 45 rss: 70Mb L: 25/92 MS: 1 CrossOver-
00:08:10.570 #91 DONE cov: 11821 ft: 15090 corp: 41/1558b lim: 105 exec/s: 45 rss: 70Mb
00:08:10.570 ###### Recommended dictionary. ######
00:08:10.570 "\003\000\000\000\000\000\000\000" # Uses: 4
00:08:10.570 "B\204\341{\247\355)\000" # Uses: 3
00:08:10.570 "\310\324\022\255\242\355)\000" # Uses: 0
00:08:10.570 "\001)\355\242\264S\251\222" # Uses: 0
00:08:10.570 "n\277\024\2106\177\000\000" # Uses: 0
00:08:10.570 ###### End of recommended dictionary. ######
00:08:10.570 Done 91 runs in 2 second(s)
00:08:10.570 02:56:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
00:08:10.570 02:56:05 -- ../common.sh@72 -- # (( i++ ))
00:08:10.570 02:56:05 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:10.570 02:56:05 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:08:10.570 02:56:05 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:08:10.570 02:56:05 -- nvmf/run.sh@24 -- # local timen=1
00:08:10.570 02:56:05 -- nvmf/run.sh@25 -- # local core=0x1
00:08:10.570 02:56:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:10.570 02:56:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:08:10.570 02:56:05 -- nvmf/run.sh@29 -- # printf %02d 17
00:08:10.570 02:56:05 -- nvmf/run.sh@29 -- # port=4417
00:08:10.570 02:56:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:10.570 02:56:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:08:10.570 02:56:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:10.570 02:56:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
[2024-07-14 02:56:05.798550] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:10.570 [2024-07-14 02:56:05.798617] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid674150 ]
00:08:10.829 EAL: No free 2048 kB hugepages reported on node 1
00:08:10.829 [2024-07-14 02:56:05.974565] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:10.829 [2024-07-14 02:56:05.993689] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:10.829 [2024-07-14 02:56:05.993804] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:10.829 [2024-07-14 02:56:06.045128] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:10.829 [2024-07-14 02:56:06.061379] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:08:10.829 INFO: Running with entropic power schedule (0xFF, 100).
00:08:10.829 INFO: Seed: 1785353660
00:08:11.089 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37),
00:08:11.089 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8),
00:08:11.089 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:11.089 INFO: A corpus is not provided, starting from an empty corpus
00:08:11.089 #2 INITED exec/s: 0 rss: 60Mb
00:08:11.089 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:11.089 This may also happen if the target rejected all inputs we tried so far
00:08:11.089 [2024-07-14 02:56:06.106115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.089 [2024-07-14 02:56:06.106150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.089 [2024-07-14 02:56:06.106182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.089 [2024-07-14 02:56:06.106200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.089 [2024-07-14 02:56:06.106234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.089 [2024-07-14 02:56:06.106250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.348 NEW_FUNC[1/672]: 0x4ab560 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540
00:08:11.348 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:11.348 #8 NEW cov: 11615 ft: 11616 corp: 2/95b lim: 120 exec/s: 0 rss: 66Mb L: 94/94 MS: 5 CrossOver-CopyPart-EraseBytes-ShuffleBytes-InsertRepeatedBytes-
00:08:11.348 [2024-07-14 02:56:06.426954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.348 [2024-07-14 02:56:06.426992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.348 [2024-07-14 02:56:06.427025] nvme_qpair.c:
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.348 [2024-07-14 02:56:06.427043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.348 [2024-07-14 02:56:06.427071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.348 [2024-07-14 02:56:06.427086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.348 [2024-07-14 02:56:06.427113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.348 [2024-07-14 02:56:06.427129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.348 #14 NEW cov: 11728 ft: 12432 corp: 3/207b lim: 120 exec/s: 0 rss: 66Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:08:11.348 [2024-07-14 02:56:06.496929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.348 [2024-07-14 02:56:06.496963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.349 [2024-07-14 02:56:06.496996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.349 [2024-07-14 02:56:06.497013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.349 #15 NEW cov: 11734 ft: 13054 corp: 4/265b lim: 120 exec/s: 0 rss: 66Mb L: 58/112 MS: 1 EraseBytes- 00:08:11.349 [2024-07-14 02:56:06.557069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.349 [2024-07-14 02:56:06.557103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.349 [2024-07-14 02:56:06.557138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.349 [2024-07-14 02:56:06.557158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.608 #16 NEW cov: 11819 ft: 13352 corp: 5/323b lim: 120 exec/s: 0 rss: 66Mb L: 58/112 MS: 1 ChangeByte- 00:08:11.608 [2024-07-14 02:56:06.627292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.627326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.627366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.627384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.608 #17 
NEW cov: 11819 ft: 13448 corp: 6/381b lim: 120 exec/s: 0 rss: 67Mb L: 58/112 MS: 1 ChangeByte- 00:08:11.608 [2024-07-14 02:56:06.677420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.677461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.677496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.677514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.608 #18 NEW cov: 11819 ft: 13601 corp: 7/439b lim: 120 exec/s: 0 rss: 67Mb L: 58/112 MS: 1 ShuffleBytes- 00:08:11.608 [2024-07-14 02:56:06.747658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.747689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.747721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.747738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.747766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.747782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.608 #24 NEW cov: 11819 ft: 13710 corp: 8/519b lim: 120 exec/s: 0 rss: 67Mb L: 80/112 MS: 1 InsertRepeatedBytes- 00:08:11.608 [2024-07-14 02:56:06.797718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.797751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.797785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.797802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.608 #25 NEW cov: 11819 ft: 13806 corp: 9/577b lim: 120 exec/s: 0 rss: 67Mb L: 58/112 MS: 1 ShuffleBytes- 00:08:11.608 [2024-07-14 02:56:06.847909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.847940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.847971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:11.608 [2024-07-14 02:56:06.847987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.848016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.848032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.608 [2024-07-14 02:56:06.848063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.608 [2024-07-14 02:56:06.848080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.868 #26 NEW cov: 11819 ft: 13850 corp: 10/695b lim: 120 exec/s: 0 rss: 67Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:11.868 [2024-07-14 02:56:06.907959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:06.907989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:06.908021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:13874926267052605884 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:06.908038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.868 #27 NEW cov: 11819 ft: 13931 corp: 11/761b lim: 120 exec/s: 0 rss: 67Mb L: 66/118 MS: 1 CMP- DE: "A\274\300\215\243\355)\000"- 00:08:11.868 [2024-07-14 02:56:06.968289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:06.968320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:06.968352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:06.968370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:06.968400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:06.968417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:06.968455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:06.968473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.868 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.868 #28 NEW cov: 11842 ft: 14020 corp: 12/880b lim: 120 exec/s: 0 rss: 67Mb L: 119/119 MS: 1 CopyPart- 
00:08:11.868 [2024-07-14 02:56:07.018383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:07.018413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:07.018450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:07.018468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:07.018497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:07.018513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:07.018541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:07.018557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.868 #29 NEW cov: 11842 ft: 14051 corp: 13/998b lim: 120 exec/s: 0 rss: 67Mb L: 118/119 MS: 1 PersAutoDict- DE: "A\274\300\215\243\355)\000"- 00:08:11.868 [2024-07-14 02:56:07.078431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:07.078467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.868 [2024-07-14 02:56:07.078500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:13874926267052605884 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.868 [2024-07-14 02:56:07.078516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.128 #30 NEW cov: 11842 ft: 14101 corp: 14/1064b lim: 120 exec/s: 30 rss: 68Mb L: 66/119 MS: 1 ChangeBinInt- 00:08:12.128 [2024-07-14 02:56:07.138681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.128 [2024-07-14 02:56:07.138711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.128 [2024-07-14 02:56:07.138741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.128 [2024-07-14 02:56:07.138758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.129 [2024-07-14 02:56:07.138785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.138801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.129 [2024-07-14 02:56:07.138828] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:105 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.138844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.129 #31 NEW cov: 11842 ft: 14119 corp: 15/1164b lim: 120 exec/s: 31 rss: 68Mb L: 100/119 MS: 1 CrossOver- 00:08:12.129 [2024-07-14 02:56:07.188734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.188765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.129 [2024-07-14 02:56:07.188797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.188814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.129 #32 NEW cov: 11842 ft: 14249 corp: 16/1222b lim: 120 exec/s: 32 rss: 68Mb L: 58/119 MS: 1 CrossOver- 00:08:12.129 [2024-07-14 02:56:07.238853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.238884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.129 [2024-07-14 02:56:07.238915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.238932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.129 #33 NEW cov: 11842 ft: 14253 corp: 17/1274b lim: 120 exec/s: 33 rss: 68Mb L: 52/119 MS: 1 EraseBytes- 00:08:12.129 [2024-07-14 02:56:07.289015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.289049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.129 [2024-07-14 02:56:07.289081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:13874926267052605884 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.289098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.129 #39 NEW cov: 11842 ft: 14264 corp: 18/1340b lim: 120 exec/s: 39 rss: 68Mb L: 66/119 MS: 1 ChangeBinInt- 00:08:12.129 [2024-07-14 02:56:07.349124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.349154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.129 [2024-07-14 02:56:07.349186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:13874926267052605884 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.129 [2024-07-14 02:56:07.349203] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.389 #40 NEW cov: 11842 ft: 14282 corp: 19/1405b lim: 120 exec/s: 40 rss: 68Mb L: 65/119 MS: 1 EraseBytes- 00:08:12.389 [2024-07-14 02:56:07.399246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.389 [2024-07-14 02:56:07.399276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.389 [2024-07-14 02:56:07.399308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.389 [2024-07-14 02:56:07.399326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.389 #41 NEW cov: 11842 ft: 14316 corp: 20/1471b lim: 120 exec/s: 41 rss: 68Mb L: 66/119 MS: 1 PersAutoDict- DE: "A\274\300\215\243\355)\000"- 00:08:12.389 [2024-07-14 02:56:07.449403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.389 [2024-07-14 02:56:07.449433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.389 [2024-07-14 02:56:07.449487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.389 [2024-07-14 02:56:07.449506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.389 #42 NEW cov: 11842 ft: 14337 corp: 21/1537b lim: 120 exec/s: 42 rss: 68Mb L: 66/119 MS: 1 ChangeBinInt- 00:08:12.390 [2024-07-14 02:56:07.509765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.509796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.509827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.509845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.509875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.509892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.509920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.509941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.509969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:12.390 [2024-07-14 02:56:07.509985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.390 #43 NEW cov: 11842 ft: 14379 corp: 22/1657b lim: 120 exec/s: 43 rss: 68Mb L: 120/120 MS: 1 InsertByte- 00:08:12.390 [2024-07-14 02:56:07.579936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.579966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.579997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:114797637560320 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.580014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.580042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.580058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.580085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.580100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.390 [2024-07-14 02:56:07.580126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.580142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.390 #44 NEW cov: 11842 ft: 14429 corp: 23/1777b lim: 120 exec/s: 44 rss: 68Mb L: 120/120 MS: 1 CrossOver- 00:08:12.390 [2024-07-14 02:56:07.639881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:114795885887488 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.390 [2024-07-14 02:56:07.639912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.649 #45 NEW cov: 11842 ft: 15281 corp: 24/1816b lim: 120 exec/s: 45 rss: 68Mb L: 39/120 MS: 1 EraseBytes- 00:08:12.649 [2024-07-14 02:56:07.700183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.700215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.700248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.700266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.700296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.700314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.700345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.700362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.649 #46 NEW cov: 11842 ft: 15305 corp: 25/1934b lim: 120 exec/s: 46 rss: 68Mb L: 118/120 MS: 1 ChangeBinInt- 00:08:12.649 [2024-07-14 02:56:07.760289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.760320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.760351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:4244438268 len:16829 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.760368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.760396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15318879334071560296 len:3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.760412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.649 #47 NEW cov: 11842 ft: 15331 corp: 26/2008b lim: 120 exec/s: 47 rss: 68Mb L: 74/120 MS: 1 InsertRepeatedBytes- 00:08:12.649 [2024-07-14 02:56:07.820512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188075855872 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.820543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.820576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.820594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.820622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.820638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.820666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:105 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.820681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.649 #48 NEW cov: 11842 ft: 15345 corp: 27/2108b lim: 120 exec/s: 48 rss: 69Mb L: 100/120 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:12.649 [2024-07-14 02:56:07.880600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.880631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.649 [2024-07-14 02:56:07.880663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.649 [2024-07-14 02:56:07.880680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.909 #49 NEW cov: 11842 ft: 15357 corp: 28/2166b lim: 120 exec/s: 49 rss: 69Mb L: 58/120 MS: 1 ShuffleBytes- 00:08:12.909 [2024-07-14 02:56:07.930806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.930837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.909 [2024-07-14 02:56:07.930869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.930889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.909 [2024-07-14 02:56:07.930918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.930934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.909 [2024-07-14 02:56:07.930960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.930976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.909 #50 NEW cov: 11842 ft: 15376 corp: 29/2284b lim: 120 exec/s: 50 rss: 69Mb L: 118/120 MS: 1 ChangeBinInt- 00:08:12.909 [2024-07-14 02:56:07.990927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.990959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.909 [2024-07-14 02:56:07.990992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.991009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.909 [2024-07-14 02:56:07.991039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.909 [2024-07-14 02:56:07.991056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.909 #51 NEW cov: 11842 ft: 15463 corp: 30/2365b lim: 120 exec/s: 51 rss: 69Mb L: 81/120 MS: 1 InsertByte- 00:08:12.909 [2024-07-14 
02:56:08.051098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:12.909 [2024-07-14 02:56:08.051128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:12.909 [2024-07-14 02:56:08.051160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:12.909 [2024-07-14 02:56:08.051178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:12.909 [2024-07-14 02:56:08.051206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:12.909 [2024-07-14 02:56:08.051222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:12.909 #52 NEW cov: 11842 ft: 15465 corp: 31/2443b lim: 120 exec/s: 26 rss: 69Mb L: 78/120 MS: 1 EraseBytes-
00:08:12.909 #52 DONE cov: 11842 ft: 15465 corp: 31/2443b lim: 120 exec/s: 26 rss: 69Mb
00:08:12.909 ###### Recommended dictionary. ######
00:08:12.909 "A\274\300\215\243\355)\000" # Uses: 4
00:08:12.909 "\002\000\000\000\000\000\000\000" # Uses: 0
00:08:12.909 ###### End of recommended dictionary. ######
00:08:12.909 Done 52 runs in 2 second(s)
00:08:13.169 02:56:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf
00:08:13.169 02:56:08 -- ../common.sh@72 -- # (( i++ ))
00:08:13.169 02:56:08 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:13.169 02:56:08 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1
00:08:13.169 02:56:08 -- nvmf/run.sh@23 -- # local fuzzer_type=18
00:08:13.169 02:56:08 -- nvmf/run.sh@24 -- # local timen=1
00:08:13.169 02:56:08 -- nvmf/run.sh@25 -- # local core=0x1
00:08:13.169 02:56:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:13.169 02:56:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf
00:08:13.169 02:56:08 -- nvmf/run.sh@29 -- # printf %02d 18
00:08:13.169 02:56:08 -- nvmf/run.sh@29 -- # port=4418
00:08:13.169 02:56:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:13.169 02:56:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418'
00:08:13.169 02:56:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:13.169 02:56:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock
[2024-07-14 02:56:08.255198] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:13.169 [2024-07-14 02:56:08.255337] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid674621 ]
00:08:13.429 EAL: No free 2048 kB hugepages reported on node 1
00:08:13.429 [2024-07-14 02:56:08.444001] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:13.429 [2024-07-14 02:56:08.463232] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:13.429 [2024-07-14 02:56:08.463346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:13.429 [2024-07-14 02:56:08.514658] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:13.429 [2024-07-14 02:56:08.530958] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 ***
00:08:13.429 INFO: Running with entropic power schedule (0xFF, 100).
00:08:13.429 INFO: Seed: 4254364263
00:08:13.429 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37),
00:08:13.429 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8),
00:08:13.429 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:13.429 INFO: A corpus is not provided, starting from an empty corpus
00:08:13.429 #2 INITED exec/s: 0 rss: 58Mb
00:08:13.429 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:13.429 This may also happen if the target rejected all inputs we tried so far
00:08:13.429 [2024-07-14 02:56:08.576243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:13.429 [2024-07-14 02:56:08.576271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:13.429 [2024-07-14 02:56:08.576304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:13.429 [2024-07-14 02:56:08.576318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:13.429 [2024-07-14 02:56:08.576364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:13.429 [2024-07-14 02:56:08.576379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:13.429 [2024-07-14 02:56:08.576427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:08:13.429 [2024-07-14 02:56:08.576440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:13.688 NEW_FUNC[1/670]: 0x4aedc0 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562
00:08:13.688 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:13.688 #7 NEW cov: 11559 ft: 11560 corp: 2/82b lim: 100 exec/s: 0 rss: 65Mb L: 81/81 MS: 5 CrossOver-CopyPart-EraseBytes-ShuffleBytes-InsertRepeatedBytes-
00:08:13.688 [2024-07-14 02:56:08.896902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:13.688 [2024-07-14
02:56:08.896944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.688 [2024-07-14 02:56:08.897004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:13.688 [2024-07-14 02:56:08.897022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.688 #23 NEW cov: 11672 ft: 12314 corp: 3/124b lim: 100 exec/s: 0 rss: 65Mb L: 42/81 MS: 1 EraseBytes- 00:08:13.948 [2024-07-14 02:56:08.946914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.948 [2024-07-14 02:56:08.946944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.948 [2024-07-14 02:56:08.946993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:13.948 [2024-07-14 02:56:08.947007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.948 #24 NEW cov: 11678 ft: 12573 corp: 4/166b lim: 100 exec/s: 0 rss: 65Mb L: 42/81 MS: 1 CrossOver- 00:08:13.948 [2024-07-14 02:56:08.987212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.948 [2024-07-14 02:56:08.987238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.948 [2024-07-14 02:56:08.987275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:13.948 [2024-07-14 02:56:08.987287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.948 [2024-07-14 02:56:08.987332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:13.948 [2024-07-14 02:56:08.987347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.948 [2024-07-14 02:56:08.987395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:13.948 [2024-07-14 02:56:08.987409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.948 #30 NEW cov: 11763 ft: 12879 corp: 5/248b lim: 100 exec/s: 0 rss: 65Mb L: 82/82 MS: 1 InsertByte- 00:08:13.948 [2024-07-14 02:56:09.027061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.948 [2024-07-14 02:56:09.027087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.948 #35 NEW cov: 11763 ft: 13241 corp: 6/268b lim: 100 exec/s: 0 rss: 65Mb L: 20/82 MS: 5 InsertRepeatedBytes-InsertByte-ShuffleBytes-ChangeBinInt-InsertByte- 00:08:13.949 [2024-07-14 02:56:09.067503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.949 [2024-07-14 02:56:09.067529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.949 [2024-07-14 02:56:09.067562] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:13.949 [2024-07-14 02:56:09.067576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.949 [2024-07-14 02:56:09.067625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:13.949 [2024-07-14 02:56:09.067639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.949 [2024-07-14 02:56:09.067687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:13.949 [2024-07-14 02:56:09.067701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.949 #36 NEW cov: 11763 ft: 13382 corp: 7/358b lim: 100 exec/s: 0 rss: 65Mb L: 90/90 MS: 1 CopyPart- 00:08:13.949 [2024-07-14 02:56:09.107555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.949 [2024-07-14 02:56:09.107581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.949 [2024-07-14 02:56:09.107627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:13.949 [2024-07-14 02:56:09.107642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.949 [2024-07-14 02:56:09.107692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:13.949 [2024-07-14 02:56:09.107706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.949 [2024-07-14 02:56:09.107754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:13.949 [2024-07-14 02:56:09.107766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.949 #37 NEW cov: 11763 ft: 13452 corp: 8/440b lim: 100 exec/s: 0 rss: 65Mb L: 82/90 MS: 1 ChangeByte- 00:08:13.949 [2024-07-14 02:56:09.147394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.949 [2024-07-14 02:56:09.147421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.949 #42 NEW cov: 11763 ft: 13507 corp: 9/477b lim: 100 exec/s: 0 rss: 65Mb L: 37/90 MS: 5 CrossOver-InsertByte-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:13.949 [2024-07-14 02:56:09.187530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:13.949 [2024-07-14 02:56:09.187556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.209 #43 NEW cov: 11763 ft: 13532 corp: 10/514b lim: 100 exec/s: 0 rss: 65Mb L: 37/90 MS: 1 ShuffleBytes- 00:08:14.209 [2024-07-14 02:56:09.227918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.209 [2024-07-14 02:56:09.227945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.227981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.210 [2024-07-14 02:56:09.227996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.228045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.210 [2024-07-14 02:56:09.228059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.228108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.210 [2024-07-14 02:56:09.228122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.210 #44 NEW cov: 11763 ft: 13580 corp: 11/604b lim: 100 exec/s: 0 rss: 66Mb L: 90/90 MS: 1 ChangeBinInt- 00:08:14.210 [2024-07-14 02:56:09.267751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.210 [2024-07-14 02:56:09.267776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 #46 NEW cov: 11763 ft: 13615 corp: 12/635b lim: 100 exec/s: 0 rss: 66Mb L: 31/90 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:14.210 [2024-07-14 02:56:09.298148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.210 [2024-07-14 02:56:09.298173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.298212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.210 [2024-07-14 02:56:09.298226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.298274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.210 [2024-07-14 02:56:09.298287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.298336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.210 [2024-07-14 02:56:09.298351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.210 #47 NEW cov: 11763 ft: 13633 corp: 13/725b lim: 100 exec/s: 0 rss: 66Mb L: 90/90 MS: 1 ChangeBit- 00:08:14.210 [2024-07-14 02:56:09.337921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.210 [2024-07-14 02:56:09.337947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 #48 NEW cov: 11763 ft: 13721 corp: 14/759b lim: 100 exec/s: 0 rss: 66Mb L: 34/90 MS: 1 EraseBytes- 00:08:14.210 [2024-07-14 02:56:09.378387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.210 [2024-07-14 02:56:09.378414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.378466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.210 [2024-07-14 02:56:09.378481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.378529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.210 [2024-07-14 02:56:09.378545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.378596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.210 [2024-07-14 02:56:09.378610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.210 #50 NEW cov: 11763 ft: 13737 corp: 15/849b lim: 100 exec/s: 0 rss: 66Mb L: 90/90 MS: 2 CopyPart-CrossOver- 00:08:14.210 [2024-07-14 02:56:09.408151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.210 [2024-07-14 02:56:09.408178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 #51 NEW cov: 11763 ft: 13783 corp: 16/887b lim: 100 exec/s: 0 rss: 66Mb L: 38/90 MS: 1 InsertByte- 00:08:14.210 [2024-07-14 02:56:09.448587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.210 [2024-07-14 02:56:09.448614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.448650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.210 [2024-07-14 02:56:09.448663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.448712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.210 [2024-07-14 02:56:09.448727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.210 [2024-07-14 02:56:09.448776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.210 [2024-07-14 02:56:09.448793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.470 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.470 #52 NEW cov: 11786 ft: 13832 corp: 17/970b lim: 100 exec/s: 0 rss: 66Mb L: 83/90 MS: 1 InsertByte- 00:08:14.470 [2024-07-14 02:56:09.488411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.470 [2024-07-14 02:56:09.488437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.470 #53 NEW cov: 11786 ft: 13898 corp: 18/996b lim: 100 exec/s: 0 rss: 66Mb L: 26/90 MS: 1 EraseBytes- 00:08:14.470 [2024-07-14 02:56:09.528621] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.470 [2024-07-14 02:56:09.528646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.470 [2024-07-14 02:56:09.528696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.471 [2024-07-14 02:56:09.528711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.471 #54 NEW cov: 11786 ft: 13919 corp: 19/1053b lim: 100 exec/s: 0 rss: 67Mb L: 57/90 MS: 1 CopyPart- 00:08:14.471 [2024-07-14 02:56:09.568953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.471 [2024-07-14 02:56:09.568979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.471 [2024-07-14 02:56:09.569023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.471 [2024-07-14 02:56:09.569038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.471 [2024-07-14 02:56:09.569088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.471 [2024-07-14 02:56:09.569102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.471 [2024-07-14 02:56:09.569150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.471 [2024-07-14 02:56:09.569164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.471 #55 NEW cov: 11786 ft: 13929 corp: 20/1146b lim: 100 exec/s: 55 rss: 67Mb L: 93/93 MS: 1 InsertRepeatedBytes- 00:08:14.471 [2024-07-14 02:56:09.598760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.471 [2024-07-14 02:56:09.598785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.471 #56 NEW cov: 11786 ft: 13974 corp: 21/1174b lim: 100 exec/s: 56 rss: 67Mb L: 28/93 MS: 1 EraseBytes- 00:08:14.471 [2024-07-14 02:56:09.638838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.471 [2024-07-14 02:56:09.638863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.471 #57 NEW cov: 11786 ft: 13993 corp: 22/1211b lim: 100 exec/s: 57 rss: 67Mb L: 37/93 MS: 1 CrossOver- 00:08:14.471 [2024-07-14 02:56:09.669232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.471 [2024-07-14 02:56:09.669258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.471 [2024-07-14 02:56:09.669301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.471 [2024-07-14 02:56:09.669317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:14.471 [2024-07-14 02:56:09.669368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.471 [2024-07-14 02:56:09.669383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.471 [2024-07-14 02:56:09.669433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.471 [2024-07-14 02:56:09.669454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.471 #58 NEW cov: 11786 ft: 14009 corp: 23/1301b lim: 100 exec/s: 58 rss: 67Mb L: 90/93 MS: 1 CrossOver- 00:08:14.471 [2024-07-14 02:56:09.699071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.471 [2024-07-14 02:56:09.699097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.471 [2024-07-14 02:56:09.699131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.471 [2024-07-14 02:56:09.699145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.731 #59 NEW cov: 11786 ft: 14017 corp: 24/1349b lim: 100 exec/s: 59 rss: 67Mb L: 48/93 MS: 1 EraseBytes- 00:08:14.731 [2024-07-14 02:56:09.739156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 02:56:09.739182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 #60 NEW cov: 11786 ft: 14037 corp: 25/1382b lim: 100 exec/s: 60 rss: 67Mb L: 33/93 MS: 1 EraseBytes- 00:08:14.731 [2024-07-14 02:56:09.779253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 02:56:09.779279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 #61 NEW cov: 11786 ft: 14061 corp: 26/1415b lim: 100 exec/s: 61 rss: 67Mb L: 33/93 MS: 1 CopyPart- 00:08:14.731 [2024-07-14 02:56:09.819632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 02:56:09.819658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.819691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.731 [2024-07-14 02:56:09.819705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.819753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.731 [2024-07-14 02:56:09.819768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.731 #62 NEW cov: 11786 ft: 14291 corp: 27/1480b lim: 100 exec/s: 62 rss: 67Mb L: 65/93 MS: 1 CrossOver- 00:08:14.731 [2024-07-14 02:56:09.859810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 
02:56:09.859837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.859876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.731 [2024-07-14 02:56:09.859891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.859938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.731 [2024-07-14 02:56:09.859953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.860003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.731 [2024-07-14 02:56:09.860019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.731 #68 NEW cov: 11786 ft: 14306 corp: 28/1573b lim: 100 exec/s: 68 rss: 67Mb L: 93/93 MS: 1 CopyPart- 00:08:14.731 [2024-07-14 02:56:09.889580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 02:56:09.889606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 #69 NEW cov: 11786 ft: 14337 corp: 29/1610b lim: 100 exec/s: 69 rss: 67Mb L: 37/93 MS: 1 ChangeByte- 00:08:14.731 [2024-07-14 02:56:09.919997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 02:56:09.920024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.920059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.731 [2024-07-14 02:56:09.920073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.920123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.731 [2024-07-14 02:56:09.920140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.920189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.731 [2024-07-14 02:56:09.920204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.731 #70 NEW cov: 11786 ft: 14363 corp: 30/1707b lim: 100 exec/s: 70 rss: 67Mb L: 97/97 MS: 1 InsertRepeatedBytes- 00:08:14.731 [2024-07-14 02:56:09.959967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.731 [2024-07-14 02:56:09.959993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.960031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.731 [2024-07-14 02:56:09.960046] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.731 [2024-07-14 02:56:09.960096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.731 [2024-07-14 02:56:09.960110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 #71 NEW cov: 11786 ft: 14376 corp: 31/1776b lim: 100 exec/s: 71 rss: 68Mb L: 69/97 MS: 1 EraseBytes- 00:08:14.991 [2024-07-14 02:56:10.000181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.000208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.000254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.991 [2024-07-14 02:56:10.000268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.000318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.991 [2024-07-14 02:56:10.000332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.000383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.991 [2024-07-14 02:56:10.000398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.991 #72 NEW cov: 11786 ft: 14387 corp: 32/1870b lim: 100 exec/s: 72 rss: 68Mb L: 94/97 MS: 1 InsertRepeatedBytes- 00:08:14.991 [2024-07-14 02:56:10.040314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.040343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.040378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.991 [2024-07-14 02:56:10.040393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.040447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.991 [2024-07-14 02:56:10.040462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.040512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.991 [2024-07-14 02:56:10.040526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.991 #73 NEW cov: 11786 ft: 14423 corp: 33/1967b lim: 100 exec/s: 73 rss: 68Mb L: 97/97 MS: 1 InsertRepeatedBytes- 00:08:14.991 [2024-07-14 02:56:10.080420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.080451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.080494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.991 [2024-07-14 02:56:10.080509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.080560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.991 [2024-07-14 02:56:10.080576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.080626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.991 [2024-07-14 02:56:10.080641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.991 #74 NEW cov: 11786 ft: 14512 corp: 34/2058b lim: 100 exec/s: 74 rss: 68Mb L: 91/97 MS: 1 InsertRepeatedBytes- 00:08:14.991 [2024-07-14 02:56:10.120578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.120615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.120662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.991 [2024-07-14 02:56:10.120677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.120727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.991 [2024-07-14 02:56:10.120741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.120790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.991 [2024-07-14 02:56:10.120803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.991 #75 NEW cov: 11786 ft: 14519 corp: 35/2140b lim: 100 exec/s: 75 rss: 68Mb L: 82/97 MS: 1 CrossOver- 00:08:14.991 [2024-07-14 02:56:10.150703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.150728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.150767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.991 [2024-07-14 02:56:10.150781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.150831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.991 [2024-07-14 02:56:10.150846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.150897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 
00:08:14.991 [2024-07-14 02:56:10.150911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.991 #76 NEW cov: 11786 ft: 14529 corp: 36/2226b lim: 100 exec/s: 76 rss: 68Mb L: 86/97 MS: 1 CrossOver- 00:08:14.991 [2024-07-14 02:56:10.190766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.190792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.190835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:14.991 [2024-07-14 02:56:10.190848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.190898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:14.991 [2024-07-14 02:56:10.190912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.991 [2024-07-14 02:56:10.190961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:14.991 [2024-07-14 02:56:10.190976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.991 #77 NEW cov: 11786 ft: 14543 corp: 37/2317b lim: 100 exec/s: 77 rss: 68Mb L: 91/97 MS: 1 InsertByte- 00:08:14.991 [2024-07-14 02:56:10.230538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:14.991 [2024-07-14 02:56:10.230565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 #78 NEW cov: 11786 ft: 14613 corp: 38/2354b lim: 100 exec/s: 78 rss: 68Mb L: 37/97 MS: 1 ChangeBit- 00:08:15.251 [2024-07-14 02:56:10.270746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.251 [2024-07-14 02:56:10.270772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.270810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.251 [2024-07-14 02:56:10.270825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.251 #79 NEW cov: 11786 ft: 14618 corp: 39/2405b lim: 100 exec/s: 79 rss: 68Mb L: 51/97 MS: 1 CopyPart- 00:08:15.251 [2024-07-14 02:56:10.310972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.251 [2024-07-14 02:56:10.310998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.311033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.251 [2024-07-14 02:56:10.311046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.311096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.251 [2024-07-14 02:56:10.311110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.251 #80 NEW cov: 11786 ft: 14619 corp: 40/2483b lim: 100 exec/s: 80 rss: 68Mb L: 78/97 MS: 1 EraseBytes- 00:08:15.251 [2024-07-14 02:56:10.351091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.251 [2024-07-14 02:56:10.351117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.351151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.251 [2024-07-14 02:56:10.351166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.351214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.251 [2024-07-14 02:56:10.351229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.251 #81 NEW cov: 11786 ft: 14638 corp: 41/2552b lim: 100 exec/s: 81 rss: 69Mb L: 69/97 MS: 1 ChangeByte- 00:08:15.251 [2024-07-14 02:56:10.391305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.251 [2024-07-14 02:56:10.391331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.391375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.251 [2024-07-14 02:56:10.391390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.391440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.251 [2024-07-14 02:56:10.391457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.251 [2024-07-14 02:56:10.391508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:15.251 [2024-07-14 02:56:10.391522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.251 #82 NEW cov: 11786 ft: 14648 corp: 42/2643b lim: 100 exec/s: 82 rss: 69Mb L: 91/97 MS: 1 InsertRepeatedBytes- 00:08:15.251 [2024-07-14 02:56:10.431062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.251 [2024-07-14 02:56:10.431088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 #83 NEW cov: 11786 ft: 14675 corp: 43/2677b lim: 100 exec/s: 83 rss: 70Mb L: 34/97 MS: 1 CrossOver- 00:08:15.251 [2024-07-14 02:56:10.471248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.251 [2024-07-14 02:56:10.471275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.251 #84 NEW cov: 11786 ft: 14701 corp: 
44/2716b lim: 100 exec/s: 84 rss: 70Mb L: 39/97 MS: 1 InsertByte- 00:08:15.539 [2024-07-14 02:56:10.511675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.539 [2024-07-14 02:56:10.511702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.539 [2024-07-14 02:56:10.511742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.539 [2024-07-14 02:56:10.511757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.539 [2024-07-14 02:56:10.511810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.539 [2024-07-14 02:56:10.511824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.539 [2024-07-14 02:56:10.511880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:15.539 [2024-07-14 02:56:10.511898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.539 #85 NEW cov: 11786 ft: 14708 corp: 45/2813b lim: 100 exec/s: 85 rss: 70Mb L: 97/97 MS: 1 ChangeByte- 00:08:15.539 [2024-07-14 02:56:10.551684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.539 [2024-07-14 02:56:10.551711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.539 [2024-07-14 02:56:10.551747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.539 [2024-07-14 02:56:10.551760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.539 [2024-07-14 02:56:10.551811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.539 [2024-07-14 02:56:10.551825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.539 #86 NEW cov: 11786 ft: 14730 corp: 46/2878b lim: 100 exec/s: 43 rss: 70Mb L: 65/97 MS: 1 CopyPart- 00:08:15.539 #86 DONE cov: 11786 ft: 14730 corp: 46/2878b lim: 100 exec/s: 43 rss: 70Mb 00:08:15.539 Done 86 runs in 2 second(s) 00:08:15.539 02:56:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:15.539 02:56:10 -- ../common.sh@72 -- # (( i++ )) 00:08:15.539 02:56:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.539 02:56:10 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:15.539 02:56:10 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:15.539 02:56:10 -- nvmf/run.sh@24 -- # local timen=1 00:08:15.539 02:56:10 -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.539 02:56:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:15.539 02:56:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:15.539 02:56:10 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:15.539 02:56:10 -- nvmf/run.sh@29 -- # port=4419 00:08:15.539 02:56:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:15.539 
02:56:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:15.539 02:56:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.539 02:56:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:15.539 [2024-07-14 02:56:10.730455] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:15.539 [2024-07-14 02:56:10.730523] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid674982 ] 00:08:15.539 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.830 [2024-07-14 02:56:10.905271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.830 [2024-07-14 02:56:10.925333] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.830 [2024-07-14 02:56:10.925456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.830 [2024-07-14 02:56:10.976930] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.830 [2024-07-14 02:56:10.993215] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:15.830 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.830 INFO: Seed: 2421403217 00:08:15.830 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:15.830 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:15.830 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:15.830 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.830 #2 INITED exec/s: 0 rss: 59Mb 00:08:15.830 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:15.830 This may also happen if the target rejected all inputs we tried so far 00:08:15.830 [2024-07-14 02:56:11.059643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:15.830 [2024-07-14 02:56:11.059678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.830 [2024-07-14 02:56:11.059811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:15.830 [2024-07-14 02:56:11.059838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.830 [2024-07-14 02:56:11.059958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:15.830 [2024-07-14 02:56:11.059982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.830 [2024-07-14 02:56:11.060109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:15.830 [2024-07-14 02:56:11.060128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.348 NEW_FUNC[1/670]: 0x4b1d80 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:16.348 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.348 #10 NEW cov: 11537 ft: 11521 corp: 2/46b lim: 50 exec/s: 0 rss: 66Mb L: 45/45 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:16.348 [2024-07-14 02:56:11.390223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:16.348 [2024-07-14 02:56:11.390275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.348 [2024-07-14 02:56:11.390400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:16.348 [2024-07-14 02:56:11.390432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.348 #11 NEW cov: 11650 ft: 12441 corp: 3/66b lim: 50 exec/s: 0 rss: 66Mb L: 20/45 MS: 1 CrossOver- 00:08:16.348 [2024-07-14 02:56:11.430224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:16.348 [2024-07-14 02:56:11.430252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.348 [2024-07-14 02:56:11.430362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:42409 00:08:16.348 [2024-07-14 02:56:11.430385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.349 #12 NEW cov: 11656 ft: 12758 corp: 4/86b lim: 50 exec/s: 0 
rss: 66Mb L: 20/45 MS: 1 ChangeBinInt- 00:08:16.349 [2024-07-14 02:56:11.470841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.470872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.470958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.470987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.471100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.471123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.471238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446648041076162559 len:43177 00:08:16.349 [2024-07-14 02:56:11.471261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.471375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.349 [2024-07-14 02:56:11.471404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.349 #13 NEW cov: 11741 ft: 13011 corp: 5/136b lim: 50 exec/s: 0 rss: 66Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:16.349 [2024-07-14 02:56:11.510964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:16.349 [2024-07-14 02:56:11.510996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.511081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:16.349 [2024-07-14 02:56:11.511102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.511211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.511232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.511342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446648041076162559 len:43177 00:08:16.349 [2024-07-14 02:56:11.511364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.511474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.349 [2024-07-14 02:56:11.511505] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.349 #14 NEW cov: 11741 ft: 13167 corp: 6/186b lim: 50 exec/s: 0 rss: 66Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:16.349 [2024-07-14 02:56:11.550922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.550952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.551023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11576688284493457576 len:43177 00:08:16.349 [2024-07-14 02:56:11.551045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.551150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.551173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.551278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:16.349 [2024-07-14 02:56:11.551298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.349 #15 NEW cov: 11741 ft: 13216 corp: 7/231b lim: 50 exec/s: 0 rss: 66Mb L: 45/50 MS: 1 ChangeBinInt- 00:08:16.349 [2024-07-14 02:56:11.590749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:16.349 [2024-07-14 02:56:11.590781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.349 [2024-07-14 02:56:11.590890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149033967298728 len:43177 00:08:16.349 [2024-07-14 02:56:11.590911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.609 #16 NEW cov: 11741 ft: 13268 corp: 8/253b lim: 50 exec/s: 0 rss: 66Mb L: 22/50 MS: 1 CMP- DE: "\000\000"- 00:08:16.609 [2024-07-14 02:56:11.631338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:16.609 [2024-07-14 02:56:11.631369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.631446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:16.609 [2024-07-14 02:56:11.631467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.631573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.609 [2024-07-14 02:56:11.631593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.631708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:96028355198976 len:43177 00:08:16.609 [2024-07-14 02:56:11.631733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.631846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.609 [2024-07-14 02:56:11.631867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.609 #17 NEW cov: 11741 ft: 13332 corp: 9/303b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:16.609 [2024-07-14 02:56:11.670951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036786526376 len:43177 00:08:16.609 [2024-07-14 02:56:11.670982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.671090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:16.609 [2024-07-14 02:56:11.671112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.609 #18 NEW cov: 11741 ft: 13371 corp: 10/323b lim: 50 exec/s: 0 rss: 67Mb L: 20/50 MS: 1 ShuffleBytes- 00:08:16.609 [2024-07-14 02:56:11.711626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:16.609 [2024-07-14 02:56:11.711655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.711735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:16.609 [2024-07-14 02:56:11.711757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.711870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153122648517814440 len:43177 00:08:16.609 [2024-07-14 02:56:11.711892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.712007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446648041076162559 len:43177 00:08:16.609 [2024-07-14 02:56:11.712030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.712147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.609 [2024-07-14 02:56:11.712174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.609 #19 NEW cov: 11741 ft: 13391 corp: 11/373b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 
ChangeByte- 00:08:16.609 [2024-07-14 02:56:11.751641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:16.609 [2024-07-14 02:56:11.751673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.751755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:16.609 [2024-07-14 02:56:11.751777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.751898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.609 [2024-07-14 02:56:11.751923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.752042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446648041076162559 len:43177 00:08:16.609 [2024-07-14 02:56:11.752063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.752183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.609 [2024-07-14 02:56:11.752204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.609 #20 NEW cov: 11741 ft: 13401 corp: 12/423b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CrossOver- 00:08:16.609 [2024-07-14 02:56:11.791861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:16.609 [2024-07-14 02:56:11.791894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.791976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:16.609 [2024-07-14 02:56:11.792000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.792108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153122648517814440 len:43177 00:08:16.609 [2024-07-14 02:56:11.792129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.792245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744072249933823 len:65449 00:08:16.609 [2024-07-14 02:56:11.792270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.609 [2024-07-14 02:56:11.792373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.609 [2024-07-14 02:56:11.792394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.609 #21 NEW cov: 11741 ft: 13443 corp: 13/473b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CrossOver- 00:08:16.610 [2024-07-14 02:56:11.841489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:16.610 [2024-07-14 02:56:11.841522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.610 [2024-07-14 02:56:11.841636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149033967298728 len:43177 00:08:16.610 [2024-07-14 02:56:11.841652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.869 #22 NEW cov: 11741 ft: 13471 corp: 14/495b lim: 50 exec/s: 0 rss: 67Mb L: 22/50 MS: 1 ChangeBit- 00:08:16.869 [2024-07-14 02:56:11.891942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:16.869 [2024-07-14 02:56:11.891971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:11.892063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11576688284493457576 len:43177 00:08:16.869 [2024-07-14 02:56:11.892087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:11.892202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.869 [2024-07-14 02:56:11.892227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:11.892343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:16.869 [2024-07-14 02:56:11.892364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.869 #23 NEW cov: 11741 ft: 13498 corp: 15/541b lim: 50 exec/s: 0 rss: 67Mb L: 46/50 MS: 1 CrossOver- 00:08:16.869 [2024-07-14 02:56:11.931666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:16.869 [2024-07-14 02:56:11.931701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:11.931814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796897448 len:43177 00:08:16.869 [2024-07-14 02:56:11.931849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.869 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.869 #24 NEW cov: 11764 ft: 13534 corp: 16/561b lim: 50 exec/s: 0 rss: 67Mb L: 20/50 MS: 1 ChangeBit- 00:08:16.869 [2024-07-14 02:56:11.981932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:1 00:08:16.869 [2024-07-14 02:56:11.981964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:11.982082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:16764835055224268968 len:43177 00:08:16.869 [2024-07-14 02:56:11.982102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.869 #25 NEW cov: 11764 ft: 13675 corp: 17/583b lim: 50 exec/s: 0 rss: 67Mb L: 22/50 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:16.869 [2024-07-14 02:56:12.022026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036786526376 len:43177 00:08:16.869 [2024-07-14 02:56:12.022059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:12.022196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12152970194358675624 len:43177 00:08:16.869 [2024-07-14 02:56:12.022218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.869 #26 NEW cov: 11764 ft: 13713 corp: 18/605b lim: 50 exec/s: 26 rss: 67Mb L: 22/50 MS: 1 CMP- DE: "\006\000"- 00:08:16.869 [2024-07-14 02:56:12.062687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:16.869 [2024-07-14 02:56:12.062721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:12.062836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:16.869 [2024-07-14 02:56:12.062856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:12.062968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.869 [2024-07-14 02:56:12.062991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.869 [2024-07-14 02:56:12.063103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446648041076162559 len:43177 00:08:16.869 [2024-07-14 02:56:12.063129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.870 [2024-07-14 02:56:12.063242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:16.870 [2024-07-14 02:56:12.063267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.870 #27 NEW cov: 11764 ft: 13761 corp: 19/655b lim: 50 exec/s: 27 rss: 67Mb L: 50/50 MS: 1 CopyPart- 00:08:16.870 [2024-07-14 02:56:12.102445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:12153149034146080936 len:43177 00:08:16.870 [2024-07-14 02:56:12.102478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.870 [2024-07-14 02:56:12.102585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:16.870 [2024-07-14 02:56:12.102608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.870 [2024-07-14 02:56:12.102725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:16.870 [2024-07-14 02:56:12.102752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.129 #28 NEW cov: 11764 ft: 13976 corp: 20/692b lim: 50 exec/s: 28 rss: 67Mb L: 37/50 MS: 1 CopyPart- 00:08:17.129 [2024-07-14 02:56:12.142518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:17.129 [2024-07-14 02:56:12.142546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.142663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036791224232 len:42409 00:08:17.129 [2024-07-14 02:56:12.142686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.129 #29 NEW cov: 11764 ft: 14004 corp: 21/712b lim: 50 exec/s: 29 rss: 67Mb L: 20/50 MS: 1 ChangeBinInt- 00:08:17.129 [2024-07-14 02:56:12.183048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:17.129 [2024-07-14 02:56:12.183080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.183148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:17.129 [2024-07-14 02:56:12.183171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.183296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:17.129 [2024-07-14 02:56:12.183320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.183432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446648041076162559 len:43177 00:08:17.129 [2024-07-14 02:56:12.183460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.183587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:17.129 [2024-07-14 02:56:12.183611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.129 #30 NEW cov: 11764 ft: 14031 corp: 22/762b lim: 50 exec/s: 30 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:17.129 [2024-07-14 02:56:12.222779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:17.129 [2024-07-14 02:56:12.222811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.222890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446743697239900159 len:43177 00:08:17.129 [2024-07-14 02:56:12.222911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.223021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43019 00:08:17.129 [2024-07-14 02:56:12.223041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.129 #31 NEW cov: 11764 ft: 14045 corp: 23/793b lim: 50 exec/s: 31 rss: 68Mb L: 31/50 MS: 1 EraseBytes- 00:08:17.129 [2024-07-14 02:56:12.263316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:17.129 [2024-07-14 02:56:12.263346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.129 [2024-07-14 02:56:12.263407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:17.129 [2024-07-14 02:56:12.263427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.130 [2024-07-14 02:56:12.263548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153122648517814440 len:43177 00:08:17.130 [2024-07-14 02:56:12.263571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.130 [2024-07-14 02:56:12.263675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446366566099451903 len:43177 00:08:17.130 [2024-07-14 02:56:12.263695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.130 [2024-07-14 02:56:12.263812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:17.130 [2024-07-14 02:56:12.263836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.130 #32 NEW cov: 11764 ft: 14068 corp: 24/843b lim: 50 exec/s: 32 rss: 68Mb L: 50/50 MS: 1 ChangeBit- 00:08:17.130 [2024-07-14 02:56:12.302805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:17.130 [2024-07-14 02:56:12.302838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.130 
[2024-07-14 02:56:12.302956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149033967298608 len:43177 00:08:17.130 [2024-07-14 02:56:12.302979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.130 #33 NEW cov: 11764 ft: 14078 corp: 25/865b lim: 50 exec/s: 33 rss: 68Mb L: 22/50 MS: 1 ChangeByte- 00:08:17.130 [2024-07-14 02:56:12.342969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:17.130 [2024-07-14 02:56:12.343000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.130 [2024-07-14 02:56:12.343123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:17.130 [2024-07-14 02:56:12.343143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.130 #38 NEW cov: 11764 ft: 14085 corp: 26/890b lim: 50 exec/s: 38 rss: 68Mb L: 25/50 MS: 5 InsertByte-ChangeBit-PersAutoDict-ShuffleBytes-InsertRepeatedBytes- DE: "\000\000"- 00:08:17.389 [2024-07-14 02:56:12.383081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148312591575208 len:1 00:08:17.389 [2024-07-14 02:56:12.383111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.383234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:47473235595362304 len:43177 00:08:17.389 [2024-07-14 02:56:12.383255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.389 #39 NEW cov: 11764 ft: 14104 corp: 27/918b lim: 50 exec/s: 39 rss: 68Mb L: 28/50 MS: 1 InsertRepeatedBytes- 00:08:17.389 [2024-07-14 02:56:12.423604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:22529 00:08:17.389 [2024-07-14 02:56:12.423639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.423769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6293402968804818944 len:43177 00:08:17.389 [2024-07-14 02:56:12.423793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.423911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12152975313959692456 len:43177 00:08:17.389 [2024-07-14 02:56:12.423933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.424054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149311674788008 len:43177 00:08:17.389 [2024-07-14 02:56:12.424076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.389 #40 NEW cov: 11764 ft: 14120 
corp: 28/963b lim: 50 exec/s: 40 rss: 68Mb L: 45/50 MS: 1 CrossOver- 00:08:17.389 [2024-07-14 02:56:12.463829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:17.389 [2024-07-14 02:56:12.463860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.463934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:17.389 [2024-07-14 02:56:12.463962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.464077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153122648517814440 len:43177 00:08:17.389 [2024-07-14 02:56:12.464103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.464219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446366566099451903 len:43177 00:08:17.389 [2024-07-14 02:56:12.464244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.389 [2024-07-14 02:56:12.464358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:17.389 [2024-07-14 02:56:12.464378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.390 #41 NEW cov: 11764 ft: 14214 corp: 29/1013b lim: 50 exec/s: 41 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:17.390 [2024-07-14 02:56:12.504062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.504092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.504162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.504190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.504306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.504329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.504447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.504467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.504597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:12153149036796881064 len:2571 00:08:17.390 [2024-07-14 
02:56:12.504620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.390 #42 NEW cov: 11764 ft: 14231 corp: 30/1063b lim: 50 exec/s: 42 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:08:17.390 [2024-07-14 02:56:12.543853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:17.390 [2024-07-14 02:56:12.543891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.543998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149035454703784 len:43177 00:08:17.390 [2024-07-14 02:56:12.544019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.544131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.544154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.390 #43 NEW cov: 11764 ft: 14255 corp: 31/1101b lim: 50 exec/s: 43 rss: 68Mb L: 38/50 MS: 1 CrossOver- 00:08:17.390 [2024-07-14 02:56:12.584098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12117401714754627752 len:43177 00:08:17.390 [2024-07-14 02:56:12.584128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.584217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12150897236983195816 len:43177 00:08:17.390 [2024-07-14 02:56:12.584243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.584356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.584387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.390 [2024-07-14 02:56:12.584504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:17.390 [2024-07-14 02:56:12.584523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.390 #44 NEW cov: 11764 ft: 14256 corp: 32/1147b lim: 50 exec/s: 44 rss: 68Mb L: 46/50 MS: 1 InsertByte- 00:08:17.390 [2024-07-14 02:56:12.623798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:17.390 [2024-07-14 02:56:12.623822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 #45 NEW cov: 11764 ft: 14537 corp: 33/1166b lim: 50 exec/s: 45 rss: 68Mb L: 19/50 MS: 1 EraseBytes- 00:08:17.649 [2024-07-14 02:56:12.663919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146097320 
len:43177 00:08:17.649 [2024-07-14 02:56:12.663950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.664036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036791224232 len:42409 00:08:17.649 [2024-07-14 02:56:12.664057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.649 #46 NEW cov: 11764 ft: 14554 corp: 34/1186b lim: 50 exec/s: 46 rss: 68Mb L: 20/50 MS: 1 ChangeBit- 00:08:17.649 [2024-07-14 02:56:12.704143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148312591575208 len:1 00:08:17.649 [2024-07-14 02:56:12.704177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.704246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:47473235595362304 len:43177 00:08:17.649 [2024-07-14 02:56:12.704267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.649 #47 NEW cov: 11764 ft: 14570 corp: 35/1214b lim: 50 exec/s: 47 rss: 68Mb L: 28/50 MS: 1 ShuffleBytes- 00:08:17.649 [2024-07-14 02:56:12.744648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.744680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.744764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.744789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.744899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153060727974308008 len:22361 00:08:17.649 [2024-07-14 02:56:12.744918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.745034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.745057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.649 #53 NEW cov: 11764 ft: 14586 corp: 36/1256b lim: 50 exec/s: 53 rss: 68Mb L: 42/50 MS: 1 CrossOver- 00:08:17.649 [2024-07-14 02:56:12.784724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.784756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.784814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.784835] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.784954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.784973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.785084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 00:08:17.649 [2024-07-14 02:56:12.785103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.649 #54 NEW cov: 11764 ft: 14598 corp: 37/1301b lim: 50 exec/s: 54 rss: 68Mb L: 45/50 MS: 1 ShuffleBytes- 00:08:17.649 [2024-07-14 02:56:12.824504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:65536 00:08:17.649 [2024-07-14 02:56:12.824539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.824659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149038262200488 len:43177 00:08:17.649 [2024-07-14 02:56:12.824684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.649 #55 NEW cov: 11764 ft: 14610 corp: 38/1327b lim: 50 exec/s: 55 rss: 69Mb L: 26/50 MS: 1 EraseBytes- 00:08:17.649 [2024-07-14 02:56:12.864683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:17.649 [2024-07-14 02:56:12.864715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.649 [2024-07-14 02:56:12.864830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149311674788008 len:43177 00:08:17.650 [2024-07-14 02:56:12.864849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.650 #56 NEW cov: 11764 ft: 14615 corp: 39/1347b lim: 50 exec/s: 56 rss: 69Mb L: 20/50 MS: 1 ShuffleBytes- 00:08:17.909 [2024-07-14 02:56:12.904870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153148693199497384 len:22360 00:08:17.909 [2024-07-14 02:56:12.904904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:12.904984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070902032552 len:65449 00:08:17.909 [2024-07-14 02:56:12.905004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:12.905113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 00:08:17.909 [2024-07-14 02:56:12.905134] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.909 #57 NEW cov: 11764 ft: 14622 corp: 40/1380b lim: 50 exec/s: 57 rss: 69Mb L: 33/50 MS: 1 EraseBytes- 00:08:17.909 [2024-07-14 02:56:12.944844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036786526376 len:43177 00:08:17.909 [2024-07-14 02:56:12.944873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:12.944986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12152970194358675624 len:43177 00:08:17.909 [2024-07-14 02:56:12.945008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.909 #58 NEW cov: 11764 ft: 14710 corp: 41/1402b lim: 50 exec/s: 58 rss: 69Mb L: 22/50 MS: 1 ChangeByte- 00:08:17.909 [2024-07-14 02:56:12.985337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149036796881064 len:43177 00:08:17.909 [2024-07-14 02:56:12.985368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:12.985453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 00:08:17.909 [2024-07-14 02:56:12.985478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:12.985618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12153060727974308008 len:22361 00:08:17.909 [2024-07-14 02:56:12.985647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:12.985762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:8982614899128051880 len:43177 00:08:17.909 [2024-07-14 02:56:12.985782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.909 #59 NEW cov: 11764 ft: 14716 corp: 42/1445b lim: 50 exec/s: 59 rss: 69Mb L: 43/50 MS: 1 InsertByte- 00:08:17.909 [2024-07-14 02:56:13.025109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12153149034146080936 len:43177 00:08:17.909 [2024-07-14 02:56:13.025140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.909 [2024-07-14 02:56:13.025245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12153149036662663336 len:43177 00:08:17.909 [2024-07-14 02:56:13.025270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.909 #60 NEW cov: 11764 ft: 14722 corp: 43/1465b lim: 50 exec/s: 30 rss: 69Mb L: 20/50 MS: 1 CrossOver- 00:08:17.909 #60 DONE cov: 11764 ft: 14722 corp: 43/1465b lim: 50 exec/s: 30 rss: 69Mb 00:08:17.909 ###### Recommended dictionary. 
###### 00:08:17.909 "\000\000" # Uses: 2 00:08:17.909 "\006\000" # Uses: 0 00:08:17.909 ###### End of recommended dictionary. ###### 00:08:17.909 Done 60 runs in 2 second(s) 00:08:17.909 02:56:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:17.909 02:56:13 -- ../common.sh@72 -- # (( i++ )) 00:08:17.909 02:56:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.909 02:56:13 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:17.909 02:56:13 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:17.909 02:56:13 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.909 02:56:13 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.909 02:56:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:17.909 02:56:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:18.167 02:56:13 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:18.167 02:56:13 -- nvmf/run.sh@29 -- # port=4420 00:08:18.167 02:56:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:18.167 02:56:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:18.167 02:56:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.167 02:56:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:18.167 [2024-07-14 02:56:13.198215] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:18.167 [2024-07-14 02:56:13.198281] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid675520 ] 00:08:18.167 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.167 [2024-07-14 02:56:13.368876] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.167 [2024-07-14 02:56:13.388413] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:18.167 [2024-07-14 02:56:13.388559] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.425 [2024-07-14 02:56:13.440089] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.425 [2024-07-14 02:56:13.456375] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:18.425 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.425 INFO: Seed: 589425326 00:08:18.425 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:18.425 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:18.425 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:18.425 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.425 #2 INITED exec/s: 0 rss: 59Mb 00:08:18.425 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:18.425 This may also happen if the target rejected all inputs we tried so far 00:08:18.425 [2024-07-14 02:56:13.501454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.425 [2024-07-14 02:56:13.501485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.683 NEW_FUNC[1/672]: 0x4b3940 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:18.683 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.683 #7 NEW cov: 11595 ft: 11596 corp: 2/20b lim: 90 exec/s: 0 rss: 66Mb L: 19/19 MS: 5 CopyPart-ChangeBit-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:18.683 [2024-07-14 02:56:13.802227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.683 [2024-07-14 02:56:13.802268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.683 #15 NEW cov: 11708 ft: 11989 corp: 3/38b lim: 90 exec/s: 0 rss: 66Mb L: 18/19 MS: 3 InsertByte-CopyPart-InsertRepeatedBytes- 00:08:18.683 [2024-07-14 02:56:13.842218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.683 [2024-07-14 02:56:13.842245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.683 #17 NEW cov: 11714 ft: 12239 corp: 4/73b lim: 90 exec/s: 0 rss: 66Mb L: 35/35 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:18.683 [2024-07-14 02:56:13.882359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.683 [2024-07-14 02:56:13.882387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.683 #18 NEW cov: 11799 ft: 12426 corp: 5/92b lim: 90 exec/s: 0 rss: 66Mb L: 19/35 MS: 1 InsertByte- 00:08:18.683 [2024-07-14 02:56:13.922720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.683 [2024-07-14 02:56:13.922749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.683 [2024-07-14 02:56:13.922788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:18.683 [2024-07-14 02:56:13.922804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.683 [2024-07-14 02:56:13.922854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:18.683 [2024-07-14 02:56:13.922871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.942 #23 NEW cov: 11799 ft: 13357 corp: 6/154b lim: 90 exec/s: 0 rss: 66Mb L: 62/62 MS: 5 ChangeBinInt-CopyPart-ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:08:18.942 [2024-07-14 02:56:13.962973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 
02:56:13.963002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:13.963040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:18.942 [2024-07-14 02:56:13.963057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:13.963108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:18.942 [2024-07-14 02:56:13.963124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:13.963175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:18.942 [2024-07-14 02:56:13.963191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.942 #28 NEW cov: 11799 ft: 13783 corp: 7/231b lim: 90 exec/s: 0 rss: 66Mb L: 77/77 MS: 5 InsertByte-EraseBytes-CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:18.942 [2024-07-14 02:56:14.002943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 02:56:14.002970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.003015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:18.942 [2024-07-14 02:56:14.003031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.003085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:18.942 [2024-07-14 02:56:14.003102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.942 #29 NEW cov: 11799 ft: 13966 corp: 8/293b lim: 90 exec/s: 0 rss: 66Mb L: 62/77 MS: 1 ShuffleBytes- 00:08:18.942 [2024-07-14 02:56:14.043203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 02:56:14.043230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.043267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:18.942 [2024-07-14 02:56:14.043283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.043333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:18.942 [2024-07-14 02:56:14.043349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.043401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:18.942 [2024-07-14 02:56:14.043417] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.942 #30 NEW cov: 11799 ft: 14035 corp: 9/371b lim: 90 exec/s: 0 rss: 67Mb L: 78/78 MS: 1 InsertByte- 00:08:18.942 [2024-07-14 02:56:14.082911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 02:56:14.082940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.942 #31 NEW cov: 11799 ft: 14097 corp: 10/389b lim: 90 exec/s: 0 rss: 67Mb L: 18/78 MS: 1 CopyPart- 00:08:18.942 [2024-07-14 02:56:14.113299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 02:56:14.113327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.113361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:18.942 [2024-07-14 02:56:14.113376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.113425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:18.942 [2024-07-14 02:56:14.113440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.942 #36 NEW cov: 11799 ft: 14139 corp: 11/444b lim: 90 exec/s: 0 rss: 67Mb L: 55/78 MS: 5 ChangeBinInt-ChangeBit-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:18.942 [2024-07-14 02:56:14.153543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 02:56:14.153572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.153614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:18.942 [2024-07-14 02:56:14.153630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.153687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:18.942 [2024-07-14 02:56:14.153705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.942 [2024-07-14 02:56:14.153756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:18.942 [2024-07-14 02:56:14.153773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.942 #37 NEW cov: 11799 ft: 14233 corp: 12/526b lim: 90 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 CopyPart- 00:08:18.942 [2024-07-14 02:56:14.193246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:18.942 [2024-07-14 02:56:14.193273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 #38 NEW cov: 11799 ft: 14344 corp: 13/544b lim: 90 exec/s: 0 rss: 67Mb 
L: 18/82 MS: 1 ChangeBit- 00:08:19.201 [2024-07-14 02:56:14.223329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.201 [2024-07-14 02:56:14.223357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 #39 NEW cov: 11799 ft: 14418 corp: 14/562b lim: 90 exec/s: 0 rss: 67Mb L: 18/82 MS: 1 ChangeByte- 00:08:19.201 [2024-07-14 02:56:14.263889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.201 [2024-07-14 02:56:14.263916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.263953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.201 [2024-07-14 02:56:14.263968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.264019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.201 [2024-07-14 02:56:14.264035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.264087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.201 [2024-07-14 02:56:14.264102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.201 #40 NEW cov: 11799 ft: 14436 corp: 15/640b lim: 90 exec/s: 0 rss: 67Mb L: 78/82 MS: 1 ChangeByte- 00:08:19.201 [2024-07-14 02:56:14.303593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.201 [2024-07-14 02:56:14.303620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 #44 NEW cov: 11799 ft: 14468 corp: 16/671b lim: 90 exec/s: 0 rss: 67Mb L: 31/82 MS: 4 EraseBytes-InsertByte-ChangeBit-CrossOver- 00:08:19.201 [2024-07-14 02:56:14.343976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.201 [2024-07-14 02:56:14.344005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.344040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.201 [2024-07-14 02:56:14.344056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.344108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.201 [2024-07-14 02:56:14.344123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.201 #45 NEW cov: 11799 ft: 14509 corp: 17/731b lim: 90 exec/s: 0 rss: 67Mb L: 60/82 MS: 1 CrossOver- 00:08:19.201 [2024-07-14 02:56:14.384211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.201 
[2024-07-14 02:56:14.384239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.384282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.201 [2024-07-14 02:56:14.384298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.384349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.201 [2024-07-14 02:56:14.384365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.201 [2024-07-14 02:56:14.384416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.201 [2024-07-14 02:56:14.384432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.201 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.201 #46 NEW cov: 11822 ft: 14559 corp: 18/810b lim: 90 exec/s: 0 rss: 68Mb L: 79/82 MS: 1 InsertByte- 00:08:19.201 [2024-07-14 02:56:14.423939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.201 [2024-07-14 02:56:14.423967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.201 #47 NEW cov: 11822 ft: 14651 corp: 19/828b lim: 90 exec/s: 0 rss: 68Mb L: 18/82 MS: 1 ChangeBit- 00:08:19.460 [2024-07-14 02:56:14.464057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.464085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 #48 NEW cov: 11822 ft: 14675 corp: 20/846b lim: 90 exec/s: 0 rss: 68Mb L: 18/82 MS: 1 ChangeBinInt- 00:08:19.460 [2024-07-14 02:56:14.504138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.504165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 #49 NEW cov: 11822 ft: 14762 corp: 21/865b lim: 90 exec/s: 49 rss: 68Mb L: 19/82 MS: 1 CrossOver- 00:08:19.460 [2024-07-14 02:56:14.544532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.544558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.544596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.460 [2024-07-14 02:56:14.544612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.544664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.460 [2024-07-14 02:56:14.544680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.460 #55 NEW cov: 11822 ft: 14772 corp: 22/925b lim: 90 exec/s: 55 rss: 68Mb L: 60/82 MS: 1 ShuffleBytes- 00:08:19.460 [2024-07-14 02:56:14.584808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.584837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.584872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.460 [2024-07-14 02:56:14.584889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.584939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.460 [2024-07-14 02:56:14.584956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.585005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.460 [2024-07-14 02:56:14.585020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.460 #56 NEW cov: 11822 ft: 14803 corp: 23/1013b lim: 90 exec/s: 56 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:19.460 [2024-07-14 02:56:14.624925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.624954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.624991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.460 [2024-07-14 02:56:14.625007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.625058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.460 [2024-07-14 02:56:14.625075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.625128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.460 [2024-07-14 02:56:14.625142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.460 #59 NEW cov: 11822 ft: 14813 corp: 24/1087b lim: 90 exec/s: 59 rss: 68Mb L: 74/88 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:19.460 [2024-07-14 02:56:14.664891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.664920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.664961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.460 [2024-07-14 02:56:14.664975] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.665028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.460 [2024-07-14 02:56:14.665043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.460 #60 NEW cov: 11822 ft: 14846 corp: 25/1143b lim: 90 exec/s: 60 rss: 68Mb L: 56/88 MS: 1 InsertByte- 00:08:19.460 [2024-07-14 02:56:14.705121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.460 [2024-07-14 02:56:14.705149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.705195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.460 [2024-07-14 02:56:14.705211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.705263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.460 [2024-07-14 02:56:14.705279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.460 [2024-07-14 02:56:14.705329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.460 [2024-07-14 02:56:14.705346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.718 #61 NEW cov: 11822 ft: 14869 corp: 26/1218b lim: 90 exec/s: 61 rss: 68Mb L: 75/88 MS: 1 InsertByte- 00:08:19.718 [2024-07-14 02:56:14.755158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.718 [2024-07-14 02:56:14.755186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.755223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.718 [2024-07-14 02:56:14.755239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.755292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.718 [2024-07-14 02:56:14.755308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.718 #62 NEW cov: 11822 ft: 14877 corp: 27/1281b lim: 90 exec/s: 62 rss: 68Mb L: 63/88 MS: 1 InsertByte- 00:08:19.718 [2024-07-14 02:56:14.795093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.718 [2024-07-14 02:56:14.795120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.795161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.718 [2024-07-14 02:56:14.795176] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.718 #63 NEW cov: 11822 ft: 15162 corp: 28/1317b lim: 90 exec/s: 63 rss: 68Mb L: 36/88 MS: 1 CopyPart- 00:08:19.718 [2024-07-14 02:56:14.835336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.718 [2024-07-14 02:56:14.835363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.835399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.718 [2024-07-14 02:56:14.835415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.835470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.718 [2024-07-14 02:56:14.835485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.718 #64 NEW cov: 11822 ft: 15196 corp: 29/1386b lim: 90 exec/s: 64 rss: 69Mb L: 69/88 MS: 1 InsertRepeatedBytes- 00:08:19.718 [2024-07-14 02:56:14.875601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.718 [2024-07-14 02:56:14.875628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.875671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.718 [2024-07-14 02:56:14.875686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.875739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.718 [2024-07-14 02:56:14.875755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.875807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.718 [2024-07-14 02:56:14.875824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.718 #65 NEW cov: 11822 ft: 15209 corp: 30/1475b lim: 90 exec/s: 65 rss: 69Mb L: 89/89 MS: 1 CopyPart- 00:08:19.718 [2024-07-14 02:56:14.915743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.718 [2024-07-14 02:56:14.915771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.915812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.718 [2024-07-14 02:56:14.915827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.915878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.718 [2024-07-14 02:56:14.915893] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.915943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.718 [2024-07-14 02:56:14.915956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.718 #66 NEW cov: 11822 ft: 15231 corp: 31/1564b lim: 90 exec/s: 66 rss: 69Mb L: 89/89 MS: 1 InsertByte- 00:08:19.718 [2024-07-14 02:56:14.955720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.718 [2024-07-14 02:56:14.955747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.955785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.718 [2024-07-14 02:56:14.955799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.718 [2024-07-14 02:56:14.955849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.718 [2024-07-14 02:56:14.955864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.978 #67 NEW cov: 11822 ft: 15239 corp: 32/1627b lim: 90 exec/s: 67 rss: 69Mb L: 63/89 MS: 1 ShuffleBytes- 00:08:19.978 [2024-07-14 02:56:14.995863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.978 [2024-07-14 02:56:14.995891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:14.995927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.978 [2024-07-14 02:56:14.995943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:14.995996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.978 [2024-07-14 02:56:14.996012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.978 #68 NEW cov: 11822 ft: 15241 corp: 33/1687b lim: 90 exec/s: 68 rss: 69Mb L: 60/89 MS: 1 ChangeBinInt- 00:08:19.978 [2024-07-14 02:56:15.035965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.978 [2024-07-14 02:56:15.035992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.036029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.978 [2024-07-14 02:56:15.036044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.036097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.978 [2024-07-14 02:56:15.036116] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.978 #69 NEW cov: 11822 ft: 15263 corp: 34/1743b lim: 90 exec/s: 69 rss: 69Mb L: 56/89 MS: 1 ChangeBinInt- 00:08:19.978 [2024-07-14 02:56:15.076207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.978 [2024-07-14 02:56:15.076234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.076270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.978 [2024-07-14 02:56:15.076286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.076336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.978 [2024-07-14 02:56:15.076352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.076400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:19.978 [2024-07-14 02:56:15.076415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.978 #70 NEW cov: 11822 ft: 15283 corp: 35/1817b lim: 90 exec/s: 70 rss: 69Mb L: 74/89 MS: 1 ChangeBinInt- 00:08:19.978 [2024-07-14 02:56:15.116082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.978 [2024-07-14 02:56:15.116109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.116163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.978 [2024-07-14 02:56:15.116178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.978 #71 NEW cov: 11822 ft: 15289 corp: 36/1856b lim: 90 exec/s: 71 rss: 69Mb L: 39/89 MS: 1 EraseBytes- 00:08:19.978 [2024-07-14 02:56:15.156318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.978 [2024-07-14 02:56:15.156345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.156381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.978 [2024-07-14 02:56:15.156398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.156452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:19.978 [2024-07-14 02:56:15.156468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.978 #72 NEW cov: 11822 ft: 15299 corp: 37/1916b lim: 90 exec/s: 72 rss: 69Mb L: 60/89 MS: 1 ChangeBit- 00:08:19.978 [2024-07-14 02:56:15.196271] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:19.978 [2024-07-14 02:56:15.196299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.978 [2024-07-14 02:56:15.196338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:19.978 [2024-07-14 02:56:15.196353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.978 #73 NEW cov: 11822 ft: 15310 corp: 38/1958b lim: 90 exec/s: 73 rss: 69Mb L: 42/89 MS: 1 EraseBytes- 00:08:20.238 [2024-07-14 02:56:15.236550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.236581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.236617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.238 [2024-07-14 02:56:15.236631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.236684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.238 [2024-07-14 02:56:15.236700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.238 #74 NEW cov: 11822 ft: 15327 corp: 39/2014b lim: 90 exec/s: 74 rss: 69Mb L: 56/89 MS: 1 ChangeByte- 00:08:20.238 [2024-07-14 02:56:15.276360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.276387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 #75 NEW cov: 11822 ft: 15399 corp: 40/2032b lim: 90 exec/s: 75 rss: 70Mb L: 18/89 MS: 1 ChangeBinInt- 00:08:20.238 [2024-07-14 02:56:15.306619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.306646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.306698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.238 [2024-07-14 02:56:15.306714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.238 #76 NEW cov: 11822 ft: 15403 corp: 41/2081b lim: 90 exec/s: 76 rss: 70Mb L: 49/89 MS: 1 EraseBytes- 00:08:20.238 [2024-07-14 02:56:15.347001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.347030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.347067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.238 [2024-07-14 02:56:15.347084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.347138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.238 [2024-07-14 02:56:15.347155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.347207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:20.238 [2024-07-14 02:56:15.347221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.238 #77 NEW cov: 11822 ft: 15449 corp: 42/2156b lim: 90 exec/s: 77 rss: 70Mb L: 75/89 MS: 1 CrossOver- 00:08:20.238 [2024-07-14 02:56:15.386672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.386700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 #78 NEW cov: 11822 ft: 15487 corp: 43/2174b lim: 90 exec/s: 78 rss: 70Mb L: 18/89 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:20.238 [2024-07-14 02:56:15.417032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.417060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.417095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.238 [2024-07-14 02:56:15.417114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.417168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.238 [2024-07-14 02:56:15.417185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.238 #79 NEW cov: 11822 ft: 15495 corp: 44/2241b lim: 90 exec/s: 79 rss: 70Mb L: 67/89 MS: 1 EraseBytes- 00:08:20.238 [2024-07-14 02:56:15.457250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.238 [2024-07-14 02:56:15.457279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.457317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.238 [2024-07-14 02:56:15.457334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.238 [2024-07-14 02:56:15.457384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.239 [2024-07-14 02:56:15.457400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.239 [2024-07-14 02:56:15.457454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:20.239 [2024-07-14 02:56:15.457470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.239 #80 NEW cov: 11822 ft: 15497 corp: 45/2330b lim: 90 exec/s: 80 rss: 70Mb L: 89/89 MS: 1 CrossOver- 00:08:20.497 [2024-07-14 02:56:15.496987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.497 [2024-07-14 02:56:15.497018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.497 #81 NEW cov: 11822 ft: 15506 corp: 46/2356b lim: 90 exec/s: 40 rss: 70Mb L: 26/89 MS: 1 CMP- DE: "V\302ItM\177\000\000"- 00:08:20.497 #81 DONE cov: 11822 ft: 15506 corp: 46/2356b lim: 90 exec/s: 40 rss: 70Mb 00:08:20.497 ###### Recommended dictionary. ###### 00:08:20.497 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:20.497 "V\302ItM\177\000\000" # Uses: 0 00:08:20.497 ###### End of recommended dictionary. ###### 00:08:20.497 Done 81 runs in 2 second(s) 00:08:20.497 02:56:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:20.497 02:56:15 -- ../common.sh@72 -- # (( i++ )) 00:08:20.497 02:56:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.497 02:56:15 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:20.497 02:56:15 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:20.497 02:56:15 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.497 02:56:15 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.497 02:56:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:20.497 02:56:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:20.497 02:56:15 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:20.497 02:56:15 -- nvmf/run.sh@29 -- # port=4421 00:08:20.497 02:56:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:20.497 02:56:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:20.497 02:56:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.498 02:56:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:20.498 [2024-07-14 02:56:15.676869] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
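[Editor's note] The run.sh trace above shows how start_llvm_fuzz parameterizes each fuzzer instance: the arguments 21 1 0x1 carry the fuzzer index, the time budget ($timen) and the core mask ($core), and the script derives a per-run TCP service ID (4421) and JSON target config (/tmp/fuzz_json_21.conf) from that index. Below is a minimal sketch of the derivation, assuming the port is simply 4400 plus the fuzzer index and that the sed output is redirected into the per-run config (the redirection itself would not appear in xtrace output); variable names are illustrative, not copied from run.sh:

  fuzzer_type=21                                   # first argument to start_llvm_fuzz
  port=$((4400 + fuzzer_type))                     # 4421, matching trsvcid:4421 in $trid (assumed arithmetic)
  nvmf_cfg="/tmp/fuzz_json_$(printf %02d "$fuzzer_type").conf"
  # rewrite the shared template so this instance listens on its own service ID
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"

Run 22 further below repeats the same sequence with port 4422 and /tmp/fuzz_json_22.conf.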
00:08:20.498 [2024-07-14 02:56:15.676941] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid675899 ] 00:08:20.498 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.756 [2024-07-14 02:56:15.852862] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.756 [2024-07-14 02:56:15.873061] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.756 [2024-07-14 02:56:15.873177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.756 [2024-07-14 02:56:15.924606] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.756 [2024-07-14 02:56:15.940924] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:20.756 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.756 INFO: Seed: 3073425912 00:08:20.756 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:20.756 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:20.756 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:20.756 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.756 #2 INITED exec/s: 0 rss: 59Mb 00:08:20.756 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:20.756 This may also happen if the target rejected all inputs we tried so far 00:08:21.015 [2024-07-14 02:56:16.010359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.015 [2024-07-14 02:56:16.010394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.273 NEW_FUNC[1/672]: 0x4b6b60 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:21.273 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.273 #17 NEW cov: 11570 ft: 11571 corp: 2/18b lim: 50 exec/s: 0 rss: 66Mb L: 17/17 MS: 5 ChangeBit-InsertByte-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:21.273 [2024-07-14 02:56:16.331254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.273 [2024-07-14 02:56:16.331303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.273 #18 NEW cov: 11683 ft: 12149 corp: 3/35b lim: 50 exec/s: 0 rss: 66Mb L: 17/17 MS: 1 ChangeByte- 00:08:21.273 [2024-07-14 02:56:16.380980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.273 [2024-07-14 02:56:16.381008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.273 #19 NEW cov: 11689 ft: 12512 corp: 4/52b lim: 50 exec/s: 0 rss: 66Mb L: 17/17 MS: 1 ChangeByte- 00:08:21.273 [2024-07-14 02:56:16.431529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.273 [2024-07-14 02:56:16.431555] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.273 #20 NEW cov: 11774 ft: 12776 corp: 5/66b lim: 50 exec/s: 0 rss: 66Mb L: 14/17 MS: 1 EraseBytes- 00:08:21.273 [2024-07-14 02:56:16.481619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.273 [2024-07-14 02:56:16.481649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.273 #21 NEW cov: 11774 ft: 12832 corp: 6/84b lim: 50 exec/s: 0 rss: 66Mb L: 18/18 MS: 1 InsertByte- 00:08:21.530 [2024-07-14 02:56:16.531784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.530 [2024-07-14 02:56:16.531815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.530 #22 NEW cov: 11774 ft: 12867 corp: 7/101b lim: 50 exec/s: 0 rss: 66Mb L: 17/18 MS: 1 ShuffleBytes- 00:08:21.530 [2024-07-14 02:56:16.581943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.530 [2024-07-14 02:56:16.581974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.530 #23 NEW cov: 11774 ft: 12919 corp: 8/118b lim: 50 exec/s: 0 rss: 66Mb L: 17/18 MS: 1 ChangeBit- 00:08:21.530 [2024-07-14 02:56:16.632115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.530 [2024-07-14 02:56:16.632139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.530 #24 NEW cov: 11774 ft: 12942 corp: 9/135b lim: 50 exec/s: 0 rss: 66Mb L: 17/18 MS: 1 ChangeBit- 00:08:21.530 [2024-07-14 02:56:16.683042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.530 [2024-07-14 02:56:16.683075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.530 [2024-07-14 02:56:16.683195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:21.530 [2024-07-14 02:56:16.683221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.530 [2024-07-14 02:56:16.683348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:21.530 [2024-07-14 02:56:16.683373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.530 [2024-07-14 02:56:16.683506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:21.530 [2024-07-14 02:56:16.683528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.530 #25 NEW cov: 11774 ft: 13865 corp: 10/175b lim: 50 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:21.530 [2024-07-14 02:56:16.742452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.530 [2024-07-14 02:56:16.742482] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.530 #26 NEW cov: 11774 ft: 13896 corp: 11/189b lim: 50 exec/s: 0 rss: 67Mb L: 14/40 MS: 1 EraseBytes- 00:08:21.788 [2024-07-14 02:56:16.792652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.788 [2024-07-14 02:56:16.792677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.788 #27 NEW cov: 11774 ft: 13917 corp: 12/207b lim: 50 exec/s: 0 rss: 67Mb L: 18/40 MS: 1 InsertByte- 00:08:21.788 [2024-07-14 02:56:16.832490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.788 [2024-07-14 02:56:16.832522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.788 [2024-07-14 02:56:16.832666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:21.788 [2024-07-14 02:56:16.832691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.788 #28 NEW cov: 11774 ft: 14248 corp: 13/236b lim: 50 exec/s: 0 rss: 67Mb L: 29/40 MS: 1 CopyPart- 00:08:21.788 [2024-07-14 02:56:16.883659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.788 [2024-07-14 02:56:16.883690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.788 [2024-07-14 02:56:16.883820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:21.788 [2024-07-14 02:56:16.883847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.788 [2024-07-14 02:56:16.883975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:21.788 [2024-07-14 02:56:16.883998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.788 [2024-07-14 02:56:16.884115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:21.788 [2024-07-14 02:56:16.884138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.788 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.788 #29 NEW cov: 11797 ft: 14378 corp: 14/277b lim: 50 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 InsertByte- 00:08:21.788 [2024-07-14 02:56:16.943351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.788 [2024-07-14 02:56:16.943385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.788 [2024-07-14 02:56:16.943519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:21.788 [2024-07-14 02:56:16.943548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.788 #30 NEW cov: 11797 ft: 14401 corp: 15/297b lim: 50 exec/s: 0 rss: 67Mb L: 20/41 MS: 1 CopyPart- 00:08:21.788 [2024-07-14 02:56:16.993495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:21.788 [2024-07-14 02:56:16.993531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.789 [2024-07-14 02:56:16.993665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:21.789 [2024-07-14 02:56:16.993691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.789 #31 NEW cov: 11797 ft: 14467 corp: 16/324b lim: 50 exec/s: 31 rss: 67Mb L: 27/41 MS: 1 InsertRepeatedBytes- 00:08:22.047 [2024-07-14 02:56:17.043476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.047 [2024-07-14 02:56:17.043502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.047 #37 NEW cov: 11797 ft: 14528 corp: 17/341b lim: 50 exec/s: 37 rss: 67Mb L: 17/41 MS: 1 ShuffleBytes- 00:08:22.047 [2024-07-14 02:56:17.093878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.047 [2024-07-14 02:56:17.093904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.047 [2024-07-14 02:56:17.094025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.047 [2024-07-14 02:56:17.094048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.047 #38 NEW cov: 11797 ft: 14542 corp: 18/361b lim: 50 exec/s: 38 rss: 67Mb L: 20/41 MS: 1 EraseBytes- 00:08:22.047 [2024-07-14 02:56:17.143517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.047 [2024-07-14 02:56:17.143552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.047 [2024-07-14 02:56:17.143671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.047 [2024-07-14 02:56:17.143700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.047 #39 NEW cov: 11797 ft: 14549 corp: 19/389b lim: 50 exec/s: 39 rss: 67Mb L: 28/41 MS: 1 CrossOver- 00:08:22.047 [2024-07-14 02:56:17.204411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.047 [2024-07-14 02:56:17.204448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.047 [2024-07-14 02:56:17.204568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.047 [2024-07-14 02:56:17.204590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.047 [2024-07-14 02:56:17.204724] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.047 [2024-07-14 02:56:17.204751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.047 #44 NEW cov: 11797 ft: 14818 corp: 20/423b lim: 50 exec/s: 44 rss: 67Mb L: 34/41 MS: 5 InsertByte-ChangeByte-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:22.047 [2024-07-14 02:56:17.254292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.047 [2024-07-14 02:56:17.254324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.047 [2024-07-14 02:56:17.254420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.047 [2024-07-14 02:56:17.254451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.047 #45 NEW cov: 11797 ft: 14859 corp: 21/444b lim: 50 exec/s: 45 rss: 68Mb L: 21/41 MS: 1 CMP- DE: "\377\377\377\013"- 00:08:22.306 [2024-07-14 02:56:17.314303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.306 [2024-07-14 02:56:17.314336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.306 #46 NEW cov: 11797 ft: 14868 corp: 22/461b lim: 50 exec/s: 46 rss: 68Mb L: 17/41 MS: 1 ShuffleBytes- 00:08:22.306 [2024-07-14 02:56:17.365261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.306 [2024-07-14 02:56:17.365296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.365434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.306 [2024-07-14 02:56:17.365458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.365579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.306 [2024-07-14 02:56:17.365604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.365731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.306 [2024-07-14 02:56:17.365755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.306 #47 NEW cov: 11797 ft: 14885 corp: 23/502b lim: 50 exec/s: 47 rss: 68Mb L: 41/41 MS: 1 ShuffleBytes- 00:08:22.306 [2024-07-14 02:56:17.414424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.306 [2024-07-14 02:56:17.414463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.414584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.306 [2024-07-14 02:56:17.414606] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.306 #48 NEW cov: 11797 ft: 14886 corp: 24/523b lim: 50 exec/s: 48 rss: 68Mb L: 21/41 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:22.306 [2024-07-14 02:56:17.465436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.306 [2024-07-14 02:56:17.465475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.465570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.306 [2024-07-14 02:56:17.465595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.465718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.306 [2024-07-14 02:56:17.465740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.306 [2024-07-14 02:56:17.465862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.306 [2024-07-14 02:56:17.465878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.306 #49 NEW cov: 11797 ft: 14961 corp: 25/567b lim: 50 exec/s: 49 rss: 68Mb L: 44/44 MS: 1 CrossOver- 00:08:22.307 [2024-07-14 02:56:17.515520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.307 [2024-07-14 02:56:17.515550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.307 [2024-07-14 02:56:17.515629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.307 [2024-07-14 02:56:17.515649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.307 [2024-07-14 02:56:17.515773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.307 [2024-07-14 02:56:17.515794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.307 [2024-07-14 02:56:17.515920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.307 [2024-07-14 02:56:17.515946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.307 #50 NEW cov: 11797 ft: 15019 corp: 26/608b lim: 50 exec/s: 50 rss: 68Mb L: 41/44 MS: 1 ChangeBit- 00:08:22.565 [2024-07-14 02:56:17.575817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.565 [2024-07-14 02:56:17.575848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.575935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.565 [2024-07-14 
02:56:17.575955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.576076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.565 [2024-07-14 02:56:17.576098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.576224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.565 [2024-07-14 02:56:17.576248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.565 #51 NEW cov: 11797 ft: 15028 corp: 27/652b lim: 50 exec/s: 51 rss: 68Mb L: 44/44 MS: 1 ChangeBit- 00:08:22.565 [2024-07-14 02:56:17.635471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.565 [2024-07-14 02:56:17.635503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.635638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.565 [2024-07-14 02:56:17.635659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.565 #52 NEW cov: 11797 ft: 15039 corp: 28/681b lim: 50 exec/s: 52 rss: 68Mb L: 29/44 MS: 1 CopyPart- 00:08:22.565 [2024-07-14 02:56:17.686152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.565 [2024-07-14 02:56:17.686183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.686298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.565 [2024-07-14 02:56:17.686318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.686447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.565 [2024-07-14 02:56:17.686468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.686594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.565 [2024-07-14 02:56:17.686616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.565 #53 NEW cov: 11797 ft: 15104 corp: 29/725b lim: 50 exec/s: 53 rss: 68Mb L: 44/44 MS: 1 ChangeBinInt- 00:08:22.565 [2024-07-14 02:56:17.736234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.565 [2024-07-14 02:56:17.736266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.736353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.565 [2024-07-14 
02:56:17.736373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.736492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.565 [2024-07-14 02:56:17.736511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.565 [2024-07-14 02:56:17.736638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.565 [2024-07-14 02:56:17.736664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.565 #59 NEW cov: 11797 ft: 15156 corp: 30/770b lim: 50 exec/s: 59 rss: 68Mb L: 45/45 MS: 1 InsertByte- 00:08:22.566 [2024-07-14 02:56:17.785720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.566 [2024-07-14 02:56:17.785745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.566 #60 NEW cov: 11797 ft: 15166 corp: 31/788b lim: 50 exec/s: 60 rss: 68Mb L: 18/45 MS: 1 InsertByte- 00:08:22.824 [2024-07-14 02:56:17.835855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.824 [2024-07-14 02:56:17.835879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.824 #61 NEW cov: 11797 ft: 15205 corp: 32/805b lim: 50 exec/s: 61 rss: 68Mb L: 17/45 MS: 1 CrossOver- 00:08:22.824 [2024-07-14 02:56:17.886568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.824 [2024-07-14 02:56:17.886600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.824 [2024-07-14 02:56:17.886737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.824 [2024-07-14 02:56:17.886763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.824 [2024-07-14 02:56:17.886889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.824 [2024-07-14 02:56:17.886910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.824 #62 NEW cov: 11797 ft: 15217 corp: 33/842b lim: 50 exec/s: 62 rss: 68Mb L: 37/45 MS: 1 CrossOver- 00:08:22.824 [2024-07-14 02:56:17.936674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.824 [2024-07-14 02:56:17.936706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.824 [2024-07-14 02:56:17.936836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.824 [2024-07-14 02:56:17.936861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.824 #68 NEW cov: 11797 ft: 15320 corp: 34/871b lim: 50 
exec/s: 68 rss: 68Mb L: 29/45 MS: 1 ChangeByte- 00:08:22.824 [2024-07-14 02:56:17.987145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.824 [2024-07-14 02:56:17.987178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.824 [2024-07-14 02:56:17.987324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.824 [2024-07-14 02:56:17.987352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.824 [2024-07-14 02:56:17.987472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.824 [2024-07-14 02:56:17.987493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.824 [2024-07-14 02:56:17.987625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:22.824 [2024-07-14 02:56:17.987646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.824 #69 NEW cov: 11797 ft: 15342 corp: 35/911b lim: 50 exec/s: 34 rss: 68Mb L: 40/45 MS: 1 InsertRepeatedBytes- 00:08:22.824 #69 DONE cov: 11797 ft: 15342 corp: 35/911b lim: 50 exec/s: 34 rss: 68Mb 00:08:22.824 ###### Recommended dictionary. ###### 00:08:22.824 "\377\377\377\013" # Uses: 3 00:08:22.824 ###### End of recommended dictionary. ###### 00:08:22.824 Done 69 runs in 2 second(s) 00:08:23.083 02:56:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:23.083 02:56:18 -- ../common.sh@72 -- # (( i++ )) 00:08:23.083 02:56:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.083 02:56:18 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:23.083 02:56:18 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:23.083 02:56:18 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.083 02:56:18 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.083 02:56:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:23.083 02:56:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:23.083 02:56:18 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:23.083 02:56:18 -- nvmf/run.sh@29 -- # port=4422 00:08:23.083 02:56:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:23.083 02:56:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:23.083 02:56:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.083 02:56:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:23.083 [2024-07-14 02:56:18.164517] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
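[Editor's note] For reference, the llvm_nvme_fuzz invocation traced above for run 22, restated one flag per line. The meanings in the comments are inferred from the surrounding log (the EAL parameters record below shows -c 0x1 -m 512; run 21's -Z 21 exercised fuzz_nvm_reservation_release_command while run 22's -Z 22 exercises fuzz_nvm_reservation_register_command), so read them as a hedged gloss rather than documented option semantics:

  fuzz_args=(
    -m 0x1                             # core mask, surfaces as EAL -c 0x1
    -s 512                             # hugepage memory in MB, surfaces as EAL -m 512
    -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/     # output location for artifacts (assumed)
    -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'  # transport ID of the TCP target under test
    -c /tmp/fuzz_json_22.conf          # per-run target config produced by the sed step
    -t 1                               # time budget from $timen, in seconds
    -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22  # seed corpus dir ("0 files found" below)
    -Z 22                              # fuzzer index, selects the reservation-register handler
    -r /var/tmp/spdk22.sock            # RPC socket for this instance
  )
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz "${fuzz_args[@]}"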
00:08:23.083 [2024-07-14 02:56:18.164600] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid676354 ] 00:08:23.083 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.342 [2024-07-14 02:56:18.345333] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.342 [2024-07-14 02:56:18.364771] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.342 [2024-07-14 02:56:18.364884] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.342 [2024-07-14 02:56:18.416281] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.342 [2024-07-14 02:56:18.432580] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:23.342 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.342 INFO: Seed: 1270441357 00:08:23.342 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:23.342 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:23.342 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:23.342 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.342 #2 INITED exec/s: 0 rss: 60Mb 00:08:23.342 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.342 This may also happen if the target rejected all inputs we tried so far 00:08:23.342 [2024-07-14 02:56:18.498648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.342 [2024-07-14 02:56:18.498682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.342 [2024-07-14 02:56:18.498781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:23.342 [2024-07-14 02:56:18.498803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.601 NEW_FUNC[1/672]: 0x4b8e20 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:23.601 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.601 #28 NEW cov: 11596 ft: 11595 corp: 2/44b lim: 85 exec/s: 0 rss: 66Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:08:23.601 [2024-07-14 02:56:18.829707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.601 [2024-07-14 02:56:18.829766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.601 [2024-07-14 02:56:18.829901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:23.601 [2024-07-14 02:56:18.829932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.860 #29 NEW cov: 11709 ft: 12148 corp: 3/92b lim: 85 exec/s: 0 rss: 66Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:23.860 [2024-07-14 02:56:18.879560] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.860 [2024-07-14 02:56:18.879588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.860 #30 NEW cov: 11715 ft: 13224 corp: 4/118b lim: 85 exec/s: 0 rss: 66Mb L: 26/48 MS: 1 CrossOver- 00:08:23.860 [2024-07-14 02:56:18.919772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.860 [2024-07-14 02:56:18.919812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.860 [2024-07-14 02:56:18.919930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:23.860 [2024-07-14 02:56:18.919949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.860 #31 NEW cov: 11800 ft: 13479 corp: 5/163b lim: 85 exec/s: 0 rss: 66Mb L: 45/48 MS: 1 CopyPart- 00:08:23.860 [2024-07-14 02:56:18.960005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.860 [2024-07-14 02:56:18.960030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.860 [2024-07-14 02:56:18.960151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:23.860 [2024-07-14 02:56:18.960175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.860 #32 NEW cov: 11800 ft: 13585 corp: 6/206b lim: 85 exec/s: 0 rss: 66Mb L: 43/48 MS: 1 ShuffleBytes- 00:08:23.860 [2024-07-14 02:56:18.999681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.860 [2024-07-14 02:56:18.999709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.860 #33 NEW cov: 11800 ft: 13679 corp: 7/232b lim: 85 exec/s: 0 rss: 67Mb L: 26/48 MS: 1 ChangeBinInt- 00:08:23.860 [2024-07-14 02:56:19.040149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.860 [2024-07-14 02:56:19.040174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.860 [2024-07-14 02:56:19.040291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:23.860 [2024-07-14 02:56:19.040312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.860 #34 NEW cov: 11800 ft: 13755 corp: 8/280b lim: 85 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 ChangeBit- 00:08:23.860 [2024-07-14 02:56:19.080543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:23.860 [2024-07-14 02:56:19.080576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.860 [2024-07-14 02:56:19.080706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 
00:08:23.860 [2024-07-14 02:56:19.080726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.860 [2024-07-14 02:56:19.080846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:23.860 [2024-07-14 02:56:19.080870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.860 #35 NEW cov: 11800 ft: 14108 corp: 9/342b lim: 85 exec/s: 0 rss: 67Mb L: 62/62 MS: 1 InsertRepeatedBytes- 00:08:24.119 [2024-07-14 02:56:19.120472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.120499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.120614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.119 [2024-07-14 02:56:19.120634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.119 #36 NEW cov: 11800 ft: 14151 corp: 10/387b lim: 85 exec/s: 0 rss: 67Mb L: 45/62 MS: 1 ChangeByte- 00:08:24.119 [2024-07-14 02:56:19.160297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.160322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 #37 NEW cov: 11800 ft: 14170 corp: 11/413b lim: 85 exec/s: 0 rss: 67Mb L: 26/62 MS: 1 ChangeByte- 00:08:24.119 [2024-07-14 02:56:19.200604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.200641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.200781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.119 [2024-07-14 02:56:19.200798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.200917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:24.119 [2024-07-14 02:56:19.200941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.119 #38 NEW cov: 11800 ft: 14194 corp: 12/469b lim: 85 exec/s: 0 rss: 67Mb L: 56/62 MS: 1 CrossOver- 00:08:24.119 [2024-07-14 02:56:19.240805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.240837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.240962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.119 [2024-07-14 02:56:19.240985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.119 #39 NEW cov: 
11800 ft: 14220 corp: 13/512b lim: 85 exec/s: 0 rss: 67Mb L: 43/62 MS: 1 ShuffleBytes- 00:08:24.119 [2024-07-14 02:56:19.280924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.280958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.281096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.119 [2024-07-14 02:56:19.281118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.119 #40 NEW cov: 11800 ft: 14272 corp: 14/555b lim: 85 exec/s: 0 rss: 67Mb L: 43/62 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:24.119 [2024-07-14 02:56:19.321305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.321334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.321449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.119 [2024-07-14 02:56:19.321471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.321592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:24.119 [2024-07-14 02:56:19.321613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.119 #41 NEW cov: 11800 ft: 14290 corp: 15/611b lim: 85 exec/s: 0 rss: 67Mb L: 56/62 MS: 1 InsertRepeatedBytes- 00:08:24.119 [2024-07-14 02:56:19.361090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.119 [2024-07-14 02:56:19.361125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.119 [2024-07-14 02:56:19.361269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.119 [2024-07-14 02:56:19.361290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.378 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.378 #42 NEW cov: 11823 ft: 14352 corp: 16/646b lim: 85 exec/s: 0 rss: 68Mb L: 35/62 MS: 1 EraseBytes- 00:08:24.378 [2024-07-14 02:56:19.411263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.379 [2024-07-14 02:56:19.411296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.411411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.379 [2024-07-14 02:56:19.411436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.379 #43 NEW cov: 11823 ft: 14366 corp: 17/689b lim: 85 exec/s: 0 rss: 
68Mb L: 43/62 MS: 1 ChangeBit- 00:08:24.379 [2024-07-14 02:56:19.451074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.379 [2024-07-14 02:56:19.451104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.451222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.379 [2024-07-14 02:56:19.451244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.379 #44 NEW cov: 11823 ft: 14469 corp: 18/732b lim: 85 exec/s: 44 rss: 68Mb L: 43/62 MS: 1 ChangeBinInt- 00:08:24.379 [2024-07-14 02:56:19.491533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.379 [2024-07-14 02:56:19.491566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.491696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.379 [2024-07-14 02:56:19.491713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.379 #45 NEW cov: 11823 ft: 14514 corp: 19/777b lim: 85 exec/s: 45 rss: 68Mb L: 45/62 MS: 1 EraseBytes- 00:08:24.379 [2024-07-14 02:56:19.531987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.379 [2024-07-14 02:56:19.532020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.532107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.379 [2024-07-14 02:56:19.532133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.532252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:24.379 [2024-07-14 02:56:19.532276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.532401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:24.379 [2024-07-14 02:56:19.532424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.379 #46 NEW cov: 11823 ft: 14875 corp: 20/861b lim: 85 exec/s: 46 rss: 68Mb L: 84/84 MS: 1 CrossOver- 00:08:24.379 [2024-07-14 02:56:19.571461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.379 [2024-07-14 02:56:19.571496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.379 #47 NEW cov: 11823 ft: 14883 corp: 21/891b lim: 85 exec/s: 47 rss: 68Mb L: 30/84 MS: 1 EraseBytes- 00:08:24.379 [2024-07-14 02:56:19.611837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.379 [2024-07-14 
02:56:19.611863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.379 [2024-07-14 02:56:19.612003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.379 [2024-07-14 02:56:19.612028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 #48 NEW cov: 11823 ft: 14897 corp: 22/934b lim: 85 exec/s: 48 rss: 68Mb L: 43/84 MS: 1 ChangeByte- 00:08:24.638 [2024-07-14 02:56:19.651977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.638 [2024-07-14 02:56:19.652002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.638 [2024-07-14 02:56:19.652120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.638 [2024-07-14 02:56:19.652141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 #49 NEW cov: 11823 ft: 14921 corp: 23/979b lim: 85 exec/s: 49 rss: 68Mb L: 45/84 MS: 1 CrossOver- 00:08:24.638 [2024-07-14 02:56:19.692145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.638 [2024-07-14 02:56:19.692170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.638 [2024-07-14 02:56:19.692287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.638 [2024-07-14 02:56:19.692306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 #50 NEW cov: 11823 ft: 14930 corp: 24/1028b lim: 85 exec/s: 50 rss: 68Mb L: 49/84 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:24.638 [2024-07-14 02:56:19.732462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.638 [2024-07-14 02:56:19.732494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.638 [2024-07-14 02:56:19.732595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.638 [2024-07-14 02:56:19.732621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 [2024-07-14 02:56:19.732749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:24.638 [2024-07-14 02:56:19.732774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.638 #51 NEW cov: 11823 ft: 14955 corp: 25/1092b lim: 85 exec/s: 51 rss: 68Mb L: 64/84 MS: 1 InsertRepeatedBytes- 00:08:24.638 [2024-07-14 02:56:19.782463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.638 [2024-07-14 02:56:19.782511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.638 
[2024-07-14 02:56:19.782608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.638 [2024-07-14 02:56:19.782628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 #52 NEW cov: 11823 ft: 14966 corp: 26/1138b lim: 85 exec/s: 52 rss: 68Mb L: 46/84 MS: 1 InsertRepeatedBytes- 00:08:24.638 [2024-07-14 02:56:19.822518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.638 [2024-07-14 02:56:19.822549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.638 [2024-07-14 02:56:19.822673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.638 [2024-07-14 02:56:19.822694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 #53 NEW cov: 11823 ft: 14986 corp: 27/1182b lim: 85 exec/s: 53 rss: 68Mb L: 44/84 MS: 1 InsertByte- 00:08:24.638 [2024-07-14 02:56:19.862337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.638 [2024-07-14 02:56:19.862363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.638 [2024-07-14 02:56:19.862496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.638 [2024-07-14 02:56:19.862536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.638 #54 NEW cov: 11823 ft: 14996 corp: 28/1230b lim: 85 exec/s: 54 rss: 68Mb L: 48/84 MS: 1 CrossOver- 00:08:24.898 [2024-07-14 02:56:19.903021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.898 [2024-07-14 02:56:19.903053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.898 [2024-07-14 02:56:19.903165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.898 [2024-07-14 02:56:19.903187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.898 [2024-07-14 02:56:19.903307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:24.898 [2024-07-14 02:56:19.903330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.898 #55 NEW cov: 11823 ft: 15065 corp: 29/1292b lim: 85 exec/s: 55 rss: 68Mb L: 62/84 MS: 1 CMP- DE: "\000)\355\253\032\207\343\204"- 00:08:24.898 [2024-07-14 02:56:19.942639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.898 [2024-07-14 02:56:19.942668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.898 #56 NEW cov: 11823 ft: 15144 corp: 30/1318b lim: 85 exec/s: 56 rss: 68Mb L: 26/84 MS: 1 ChangeBit- 00:08:24.898 [2024-07-14 02:56:19.982909] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.898 [2024-07-14 02:56:19.982944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.898 [2024-07-14 02:56:19.983072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:24.898 [2024-07-14 02:56:19.983094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.898 #57 NEW cov: 11823 ft: 15149 corp: 31/1361b lim: 85 exec/s: 57 rss: 68Mb L: 43/84 MS: 1 ShuffleBytes- 00:08:24.898 [2024-07-14 02:56:20.032984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.898 [2024-07-14 02:56:20.033012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.898 #58 NEW cov: 11823 ft: 15295 corp: 32/1387b lim: 85 exec/s: 58 rss: 69Mb L: 26/84 MS: 1 ChangeBinInt- 00:08:24.898 [2024-07-14 02:56:20.083170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.898 [2024-07-14 02:56:20.083204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.898 #59 NEW cov: 11823 ft: 15323 corp: 33/1417b lim: 85 exec/s: 59 rss: 69Mb L: 30/84 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:24.898 [2024-07-14 02:56:20.132762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:24.898 [2024-07-14 02:56:20.132791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.157 #60 NEW cov: 11823 ft: 15369 corp: 34/1449b lim: 85 exec/s: 60 rss: 69Mb L: 32/84 MS: 1 EraseBytes- 00:08:25.157 [2024-07-14 02:56:20.183309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.157 [2024-07-14 02:56:20.183336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.157 #61 NEW cov: 11823 ft: 15444 corp: 35/1475b lim: 85 exec/s: 61 rss: 69Mb L: 26/84 MS: 1 PersAutoDict- DE: "\000)\355\253\032\207\343\204"- 00:08:25.157 [2024-07-14 02:56:20.233776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.157 [2024-07-14 02:56:20.233809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.157 [2024-07-14 02:56:20.233936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:25.157 [2024-07-14 02:56:20.233956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.157 #62 NEW cov: 11823 ft: 15454 corp: 36/1520b lim: 85 exec/s: 62 rss: 69Mb L: 45/84 MS: 1 ChangeByte- 00:08:25.157 [2024-07-14 02:56:20.283679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.157 [2024-07-14 02:56:20.283707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.157 #63 NEW cov: 11823 ft: 15467 corp: 37/1546b lim: 85 exec/s: 63 rss: 69Mb L: 26/84 MS: 1 CopyPart- 00:08:25.157 [2024-07-14 02:56:20.333849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.157 [2024-07-14 02:56:20.333878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.157 #64 NEW cov: 11823 ft: 15473 corp: 38/1569b lim: 85 exec/s: 64 rss: 69Mb L: 23/84 MS: 1 EraseBytes- 00:08:25.157 [2024-07-14 02:56:20.384148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.158 [2024-07-14 02:56:20.384182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.158 [2024-07-14 02:56:20.384274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:25.158 [2024-07-14 02:56:20.384296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.417 #65 NEW cov: 11823 ft: 15548 corp: 39/1613b lim: 85 exec/s: 65 rss: 69Mb L: 44/84 MS: 1 ChangeBit- 00:08:25.417 [2024-07-14 02:56:20.434377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.417 [2024-07-14 02:56:20.434412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.417 [2024-07-14 02:56:20.434535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:25.417 [2024-07-14 02:56:20.434560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.417 #71 NEW cov: 11823 ft: 15554 corp: 40/1658b lim: 85 exec/s: 71 rss: 69Mb L: 45/84 MS: 1 ChangeByte- 00:08:25.417 [2024-07-14 02:56:20.474498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.417 [2024-07-14 02:56:20.474526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.417 [2024-07-14 02:56:20.474638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:25.417 [2024-07-14 02:56:20.474658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.417 #72 NEW cov: 11823 ft: 15603 corp: 41/1701b lim: 85 exec/s: 36 rss: 69Mb L: 43/84 MS: 1 EraseBytes- 00:08:25.417 #72 DONE cov: 11823 ft: 15603 corp: 41/1701b lim: 85 exec/s: 36 rss: 69Mb 00:08:25.417 ###### Recommended dictionary. ###### 00:08:25.417 "\000\000\000\000" # Uses: 1 00:08:25.417 "\000)\355\253\032\207\343\204" # Uses: 1 00:08:25.417 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:25.417 ###### End of recommended dictionary. 
###### 00:08:25.417 Done 72 runs in 2 second(s) 00:08:25.417 02:56:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:25.417 02:56:20 -- ../common.sh@72 -- # (( i++ )) 00:08:25.417 02:56:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.417 02:56:20 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:25.417 02:56:20 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:25.417 02:56:20 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.417 02:56:20 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.417 02:56:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:25.417 02:56:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:25.417 02:56:20 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:25.417 02:56:20 -- nvmf/run.sh@29 -- # port=4423 00:08:25.418 02:56:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:25.418 02:56:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:25.418 02:56:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.418 02:56:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:25.418 [2024-07-14 02:56:20.650226] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:25.418 [2024-07-14 02:56:20.650313] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid676897 ] 00:08:25.677 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.677 [2024-07-14 02:56:20.826538] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.677 [2024-07-14 02:56:20.845902] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.677 [2024-07-14 02:56:20.846017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.677 [2024-07-14 02:56:20.897309] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.677 [2024-07-14 02:56:20.913570] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:25.936 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.936 INFO: Seed: 3753455275 00:08:25.936 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:25.936 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:25.936 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:25.936 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.936 #2 INITED exec/s: 0 rss: 60Mb 00:08:25.936 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:25.936 This may also happen if the target rejected all inputs we tried so far 00:08:25.936 [2024-07-14 02:56:20.968796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.936 [2024-07-14 02:56:20.968826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.936 [2024-07-14 02:56:20.968888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.936 [2024-07-14 02:56:20.968903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.195 NEW_FUNC[1/671]: 0x4bc050 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:26.196 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.196 #3 NEW cov: 11529 ft: 11530 corp: 2/13b lim: 25 exec/s: 0 rss: 66Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:08:26.196 [2024-07-14 02:56:21.279648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.196 [2024-07-14 02:56:21.279683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.279741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.196 [2024-07-14 02:56:21.279757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.196 #9 NEW cov: 11642 ft: 11948 corp: 3/25b lim: 25 exec/s: 0 rss: 66Mb L: 12/12 MS: 1 ChangeBinInt- 00:08:26.196 [2024-07-14 02:56:21.319684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.196 [2024-07-14 02:56:21.319715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.319767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.196 [2024-07-14 02:56:21.319783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.196 #10 NEW cov: 11648 ft: 12368 corp: 4/37b lim: 25 exec/s: 0 rss: 66Mb L: 12/12 MS: 1 ChangeBinInt- 00:08:26.196 [2024-07-14 02:56:21.359840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.196 [2024-07-14 02:56:21.359868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.359927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.196 [2024-07-14 02:56:21.359943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.196 #11 NEW cov: 11733 ft: 12638 corp: 5/49b lim: 25 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 ShuffleBytes- 00:08:26.196 [2024-07-14 02:56:21.400148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:0 nsid:0 00:08:26.196 [2024-07-14 02:56:21.400177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.400216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.196 [2024-07-14 02:56:21.400232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.400289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:26.196 [2024-07-14 02:56:21.400305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.400365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.196 [2024-07-14 02:56:21.400379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.196 #12 NEW cov: 11733 ft: 13252 corp: 6/73b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:26.196 [2024-07-14 02:56:21.440066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.196 [2024-07-14 02:56:21.440095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.196 [2024-07-14 02:56:21.440153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.196 [2024-07-14 02:56:21.440170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 #13 NEW cov: 11733 ft: 13341 corp: 7/85b lim: 25 exec/s: 0 rss: 67Mb L: 12/24 MS: 1 ChangeBinInt- 00:08:26.456 [2024-07-14 02:56:21.480174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.456 [2024-07-14 02:56:21.480203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.480250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.456 [2024-07-14 02:56:21.480266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 #14 NEW cov: 11733 ft: 13440 corp: 8/97b lim: 25 exec/s: 0 rss: 67Mb L: 12/24 MS: 1 ShuffleBytes- 00:08:26.456 [2024-07-14 02:56:21.520660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.456 [2024-07-14 02:56:21.520692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.520741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.456 [2024-07-14 02:56:21.520761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.520821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 
nsid:0 00:08:26.456 [2024-07-14 02:56:21.520840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.520898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.456 [2024-07-14 02:56:21.520917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.520979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:26.456 [2024-07-14 02:56:21.520996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.456 #15 NEW cov: 11733 ft: 13506 corp: 9/122b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CopyPart- 00:08:26.456 [2024-07-14 02:56:21.560365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.456 [2024-07-14 02:56:21.560394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.560462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.456 [2024-07-14 02:56:21.560481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 #16 NEW cov: 11733 ft: 13520 corp: 10/134b lim: 25 exec/s: 0 rss: 67Mb L: 12/25 MS: 1 CMP- DE: "\377\377"- 00:08:26.456 [2024-07-14 02:56:21.590519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.456 [2024-07-14 02:56:21.590548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.590596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.456 [2024-07-14 02:56:21.590614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 #17 NEW cov: 11733 ft: 13618 corp: 11/146b lim: 25 exec/s: 0 rss: 67Mb L: 12/25 MS: 1 ChangeByte- 00:08:26.456 [2024-07-14 02:56:21.630572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.456 [2024-07-14 02:56:21.630601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 02:56:21.630657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.456 [2024-07-14 02:56:21.630674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 #18 NEW cov: 11733 ft: 13640 corp: 12/158b lim: 25 exec/s: 0 rss: 67Mb L: 12/25 MS: 1 ShuffleBytes- 00:08:26.456 [2024-07-14 02:56:21.670700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.456 [2024-07-14 02:56:21.670728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.456 [2024-07-14 
02:56:21.670784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.456 [2024-07-14 02:56:21.670800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.456 #23 NEW cov: 11733 ft: 13711 corp: 13/168b lim: 25 exec/s: 0 rss: 67Mb L: 10/25 MS: 5 ShuffleBytes-ChangeByte-ChangeBit-InsertByte-CMP- DE: "\376\003\000\000\000\000\000\000"- 00:08:26.716 [2024-07-14 02:56:21.710856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 02:56:21.710885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.710942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.710959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 #24 NEW cov: 11733 ft: 13788 corp: 14/181b lim: 25 exec/s: 0 rss: 67Mb L: 13/25 MS: 1 InsertByte- 00:08:26.716 [2024-07-14 02:56:21.750936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 02:56:21.750965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.751021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.751038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 #25 NEW cov: 11733 ft: 13828 corp: 15/191b lim: 25 exec/s: 0 rss: 67Mb L: 10/25 MS: 1 ChangeByte- 00:08:26.716 [2024-07-14 02:56:21.791328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 02:56:21.791357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.791397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.791414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.791471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:26.716 [2024-07-14 02:56:21.791489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.791548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.716 [2024-07-14 02:56:21.791569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.716 #26 NEW cov: 11733 ft: 13861 corp: 16/215b lim: 25 exec/s: 0 rss: 68Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:26.716 [2024-07-14 02:56:21.831457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 
02:56:21.831486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.831523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.831539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.831598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:26.716 [2024-07-14 02:56:21.831613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.831673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.716 [2024-07-14 02:56:21.831689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.716 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.716 #27 NEW cov: 11756 ft: 13963 corp: 17/239b lim: 25 exec/s: 0 rss: 68Mb L: 24/25 MS: 1 CopyPart- 00:08:26.716 [2024-07-14 02:56:21.871289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 02:56:21.871317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.871366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.871383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 #28 NEW cov: 11756 ft: 13979 corp: 18/251b lim: 25 exec/s: 0 rss: 68Mb L: 12/25 MS: 1 ChangeBit- 00:08:26.716 [2024-07-14 02:56:21.901365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 02:56:21.901394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.901457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.901474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 #29 NEW cov: 11756 ft: 13997 corp: 19/261b lim: 25 exec/s: 0 rss: 68Mb L: 10/25 MS: 1 ChangeBit- 00:08:26.716 [2024-07-14 02:56:21.941476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.716 [2024-07-14 02:56:21.941505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.716 [2024-07-14 02:56:21.941554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.716 [2024-07-14 02:56:21.941571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.716 #30 NEW cov: 11756 ft: 14049 corp: 20/273b lim: 25 
exec/s: 30 rss: 68Mb L: 12/25 MS: 1 ChangeBinInt- 00:08:26.976 [2024-07-14 02:56:21.981664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.976 [2024-07-14 02:56:21.981692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.976 [2024-07-14 02:56:21.981748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.976 [2024-07-14 02:56:21.981764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.976 #31 NEW cov: 11756 ft: 14062 corp: 21/286b lim: 25 exec/s: 31 rss: 68Mb L: 13/25 MS: 1 ChangeASCIIInt- 00:08:26.976 [2024-07-14 02:56:22.022120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.976 [2024-07-14 02:56:22.022148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.976 [2024-07-14 02:56:22.022202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.976 [2024-07-14 02:56:22.022219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.976 [2024-07-14 02:56:22.022278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:26.977 [2024-07-14 02:56:22.022295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.022352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.977 [2024-07-14 02:56:22.022367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.022426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:26.977 [2024-07-14 02:56:22.022446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.977 #32 NEW cov: 11756 ft: 14108 corp: 22/311b lim: 25 exec/s: 32 rss: 68Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:26.977 [2024-07-14 02:56:22.061873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-07-14 02:56:22.061901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.061942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-07-14 02:56:22.061959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 #33 NEW cov: 11756 ft: 14125 corp: 23/321b lim: 25 exec/s: 33 rss: 68Mb L: 10/25 MS: 1 ChangeBinInt- 00:08:26.977 [2024-07-14 02:56:22.102410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-07-14 02:56:22.102439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.102491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-07-14 02:56:22.102508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.102565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:26.977 [2024-07-14 02:56:22.102580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.102639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.977 [2024-07-14 02:56:22.102656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.977 #34 NEW cov: 11756 ft: 14168 corp: 24/345b lim: 25 exec/s: 34 rss: 68Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:26.977 [2024-07-14 02:56:22.142131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-07-14 02:56:22.142160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.142222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-07-14 02:56:22.142237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 #35 NEW cov: 11756 ft: 14227 corp: 25/357b lim: 25 exec/s: 35 rss: 68Mb L: 12/25 MS: 1 ChangeBinInt- 00:08:26.977 [2024-07-14 02:56:22.172213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-07-14 02:56:22.172240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.172291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-07-14 02:56:22.172308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 #36 NEW cov: 11756 ft: 14268 corp: 26/369b lim: 25 exec/s: 36 rss: 68Mb L: 12/25 MS: 1 ShuffleBytes- 00:08:26.977 [2024-07-14 02:56:22.212741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-07-14 02:56:22.212769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.212819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-07-14 02:56:22.212834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.212891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:26.977 [2024-07-14 02:56:22.212907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.212961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:26.977 [2024-07-14 02:56:22.212976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.977 [2024-07-14 02:56:22.213034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:26.977 [2024-07-14 02:56:22.213049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.236 #37 NEW cov: 11756 ft: 14281 corp: 27/394b lim: 25 exec/s: 37 rss: 68Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:27.236 [2024-07-14 02:56:22.252356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.236 [2024-07-14 02:56:22.252383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.236 #38 NEW cov: 11756 ft: 14659 corp: 28/403b lim: 25 exec/s: 38 rss: 68Mb L: 9/25 MS: 1 EraseBytes- 00:08:27.236 [2024-07-14 02:56:22.292689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.236 [2024-07-14 02:56:22.292718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.236 [2024-07-14 02:56:22.292758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.236 [2024-07-14 02:56:22.292775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.236 [2024-07-14 02:56:22.292832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.236 [2024-07-14 02:56:22.292849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.237 #39 NEW cov: 11756 ft: 14863 corp: 29/421b lim: 25 exec/s: 39 rss: 68Mb L: 18/25 MS: 1 InsertRepeatedBytes- 00:08:27.237 [2024-07-14 02:56:22.332556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-07-14 02:56:22.332583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 #40 NEW cov: 11756 ft: 14875 corp: 30/430b lim: 25 exec/s: 40 rss: 68Mb L: 9/25 MS: 1 ChangeBinInt- 00:08:27.237 [2024-07-14 02:56:22.373036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-07-14 02:56:22.373064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.373113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.237 [2024-07-14 02:56:22.373128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.373186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.237 [2024-07-14 02:56:22.373201] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.373259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.237 [2024-07-14 02:56:22.373275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.237 #41 NEW cov: 11756 ft: 14898 corp: 31/454b lim: 25 exec/s: 41 rss: 68Mb L: 24/25 MS: 1 ChangeByte- 00:08:27.237 [2024-07-14 02:56:22.413280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-07-14 02:56:22.413310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.413360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.237 [2024-07-14 02:56:22.413376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.413435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.237 [2024-07-14 02:56:22.413460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.413518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.237 [2024-07-14 02:56:22.413535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.413603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:27.237 [2024-07-14 02:56:22.413619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.237 #42 NEW cov: 11756 ft: 14903 corp: 32/479b lim: 25 exec/s: 42 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:27.237 [2024-07-14 02:56:22.453035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-07-14 02:56:22.453063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.453123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.237 [2024-07-14 02:56:22.453141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.237 #43 NEW cov: 11756 ft: 14909 corp: 33/491b lim: 25 exec/s: 43 rss: 68Mb L: 12/25 MS: 1 ShuffleBytes- 00:08:27.237 [2024-07-14 02:56:22.483082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-07-14 02:56:22.483112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 [2024-07-14 02:56:22.483167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.237 [2024-07-14 02:56:22.483182] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.497 #44 NEW cov: 11756 ft: 14936 corp: 34/501b lim: 25 exec/s: 44 rss: 68Mb L: 10/25 MS: 1 ShuffleBytes- 00:08:27.497 [2024-07-14 02:56:22.523484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.497 [2024-07-14 02:56:22.523519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.497 [2024-07-14 02:56:22.523566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.497 [2024-07-14 02:56:22.523583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.497 [2024-07-14 02:56:22.523642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.497 [2024-07-14 02:56:22.523659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.497 [2024-07-14 02:56:22.523717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.497 [2024-07-14 02:56:22.523734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.497 #45 NEW cov: 11756 ft: 14949 corp: 35/523b lim: 25 exec/s: 45 rss: 68Mb L: 22/25 MS: 1 EraseBytes- 00:08:27.497 [2024-07-14 02:56:22.553461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.498 [2024-07-14 02:56:22.553490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.553529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.498 [2024-07-14 02:56:22.553544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.553601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.498 [2024-07-14 02:56:22.553616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.498 #46 NEW cov: 11756 ft: 14958 corp: 36/541b lim: 25 exec/s: 46 rss: 69Mb L: 18/25 MS: 1 PersAutoDict- DE: "\376\003\000\000\000\000\000\000"- 00:08:27.498 [2024-07-14 02:56:22.593486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.498 [2024-07-14 02:56:22.593514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.593554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.498 [2024-07-14 02:56:22.593571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.498 #47 NEW cov: 11756 ft: 14967 corp: 37/553b lim: 25 exec/s: 47 rss: 69Mb L: 12/25 MS: 1 ChangeBit- 00:08:27.498 [2024-07-14 02:56:22.623901] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.498 [2024-07-14 02:56:22.623930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.623984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.498 [2024-07-14 02:56:22.624000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.624063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.498 [2024-07-14 02:56:22.624080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.624136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.498 [2024-07-14 02:56:22.624152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.624211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:27.498 [2024-07-14 02:56:22.624228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.498 #48 NEW cov: 11756 ft: 14980 corp: 38/578b lim: 25 exec/s: 48 rss: 69Mb L: 25/25 MS: 1 ChangeByte- 00:08:27.498 [2024-07-14 02:56:22.663665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.498 [2024-07-14 02:56:22.663694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.663745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.498 [2024-07-14 02:56:22.663762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.498 #49 NEW cov: 11756 ft: 14985 corp: 39/591b lim: 25 exec/s: 49 rss: 69Mb L: 13/25 MS: 1 ChangeByte- 00:08:27.498 [2024-07-14 02:56:22.703886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.498 [2024-07-14 02:56:22.703915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.703955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.498 [2024-07-14 02:56:22.703971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.704032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.498 [2024-07-14 02:56:22.704049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.498 #50 NEW cov: 11756 ft: 14990 corp: 40/608b lim: 25 exec/s: 50 rss: 69Mb L: 17/25 MS: 1 EraseBytes- 00:08:27.498 [2024-07-14 02:56:22.744260] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.498 [2024-07-14 02:56:22.744289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.744345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.498 [2024-07-14 02:56:22.744363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.744421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.498 [2024-07-14 02:56:22.744437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.744505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.498 [2024-07-14 02:56:22.744522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.498 [2024-07-14 02:56:22.744580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:27.498 [2024-07-14 02:56:22.744610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.758 #51 NEW cov: 11756 ft: 15001 corp: 41/633b lim: 25 exec/s: 51 rss: 69Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:27.758 [2024-07-14 02:56:22.783935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.758 [2024-07-14 02:56:22.783965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.784007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.758 [2024-07-14 02:56:22.784023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.758 #52 NEW cov: 11756 ft: 15012 corp: 42/643b lim: 25 exec/s: 52 rss: 69Mb L: 10/25 MS: 1 CrossOver- 00:08:27.758 [2024-07-14 02:56:22.824461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.758 [2024-07-14 02:56:22.824490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.824544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.758 [2024-07-14 02:56:22.824560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.824627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.758 [2024-07-14 02:56:22.824642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.824696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.758 [2024-07-14 02:56:22.824712] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.824767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:27.758 [2024-07-14 02:56:22.824782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.758 #53 NEW cov: 11756 ft: 15014 corp: 43/668b lim: 25 exec/s: 53 rss: 69Mb L: 25/25 MS: 1 PersAutoDict- DE: "\376\003\000\000\000\000\000\000"- 00:08:27.758 [2024-07-14 02:56:22.864386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.758 [2024-07-14 02:56:22.864414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.864460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.758 [2024-07-14 02:56:22.864476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.864535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.758 [2024-07-14 02:56:22.864552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.758 #54 NEW cov: 11756 ft: 15053 corp: 44/687b lim: 25 exec/s: 54 rss: 69Mb L: 19/25 MS: 1 EraseBytes- 00:08:27.758 [2024-07-14 02:56:22.904743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.758 [2024-07-14 02:56:22.904771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.904826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.758 [2024-07-14 02:56:22.904839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.904893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:27.758 [2024-07-14 02:56:22.904909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.904962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:27.758 [2024-07-14 02:56:22.904977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.758 [2024-07-14 02:56:22.905031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:27.758 [2024-07-14 02:56:22.905046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.759 [2024-07-14 02:56:22.944455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.759 [2024-07-14 02:56:22.944483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.759 [2024-07-14 02:56:22.944526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.759 [2024-07-14 02:56:22.944542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.759 #56 NEW cov: 11756 ft: 15106 corp: 45/701b lim: 25 exec/s: 28 rss: 69Mb L: 14/25 MS: 2 ChangeByte-EraseBytes- 00:08:27.759 #56 DONE cov: 11756 ft: 15106 corp: 45/701b lim: 25 exec/s: 28 rss: 69Mb 00:08:27.759 ###### Recommended dictionary. ###### 00:08:27.759 "\377\377" # Uses: 0 00:08:27.759 "\376\003\000\000\000\000\000\000" # Uses: 2 00:08:27.759 ###### End of recommended dictionary. ###### 00:08:27.759 Done 56 runs in 2 second(s) 00:08:28.018 02:56:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:28.018 02:56:23 -- ../common.sh@72 -- # (( i++ )) 00:08:28.018 02:56:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.018 02:56:23 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:28.018 02:56:23 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:28.018 02:56:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:28.018 02:56:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.018 02:56:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:28.018 02:56:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:28.018 02:56:23 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:28.018 02:56:23 -- nvmf/run.sh@29 -- # port=4424 00:08:28.018 02:56:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:28.018 02:56:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:28.019 02:56:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.019 02:56:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:28.019 [2024-07-14 02:56:23.120598] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
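Aside (not part of the console output above): the "Recommended dictionary" block printed by the previous run is standard libFuzzer output, and such entries can be fed back into a later run through libFuzzer's -dict= flag. A minimal sketch follows; the /tmp path, the file name, and the flag placement are hypothetical and not part of this job's run.sh scripts, and whether the llvm_nvme_fuzz wrapper forwards extra flags to libFuzzer is an assumption here. libFuzzer dictionary files accept \xNN hex escapes (plus \\ and \"), so the octal escapes shown in the log ("\377" = 0xff, "\376" = 0xfe, "\003" = 0x03) are rewritten in hex:

    # Hypothetical: persist the recommended entries so a later run can seed
    # its mutator with them. Octal escapes from the log are converted to the
    # \xNN hex form that libFuzzer dictionary files parse.
    cat > /tmp/llvm_nvmf_23.dict <<'EOF'
    # two 0xff bytes, from "\377\377"
    "\xff\xff"
    # 0xfe 0x03 followed by six zero bytes, from "\376\003\000\000\000\000\000\000"
    "\xfe\x03\x00\x00\x00\x00\x00\x00"
    EOF
    # then, assuming pass-through of libFuzzer flags, append to the invocation:
    #   .../llvm_nvme_fuzz ... -dict=/tmp/llvm_nvmf_23.dict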
00:08:28.019 [2024-07-14 02:56:23.120676] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid677238 ] 00:08:28.019 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.277 [2024-07-14 02:56:23.302239] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.277 [2024-07-14 02:56:23.322436] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.277 [2024-07-14 02:56:23.322565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.277 [2024-07-14 02:56:23.374131] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.277 [2024-07-14 02:56:23.390433] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:28.277 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.277 INFO: Seed: 1933488641 00:08:28.277 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:28.277 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:28.277 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:28.277 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.277 #2 INITED exec/s: 0 rss: 59Mb 00:08:28.277 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.277 This may also happen if the target rejected all inputs we tried so far 00:08:28.277 [2024-07-14 02:56:23.456637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.277 [2024-07-14 02:56:23.456672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.277 [2024-07-14 02:56:23.456789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.277 [2024-07-14 02:56:23.456815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.277 [2024-07-14 02:56:23.456935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.277 [2024-07-14 02:56:23.456956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.536 NEW_FUNC[1/672]: 0x4bd130 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:28.536 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.536 #4 NEW cov: 11582 ft: 11599 corp: 2/76b lim: 100 exec/s: 0 rss: 66Mb L: 75/75 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:28.536 [2024-07-14 02:56:23.787886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.536 [2024-07-14 02:56:23.787924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.536 [2024-07-14 02:56:23.788050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.536 [2024-07-14 02:56:23.788073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.788201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.788221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.795 #5 NEW cov: 11714 ft: 12104 corp: 3/151b lim: 100 exec/s: 0 rss: 66Mb L: 75/75 MS: 1 ShuffleBytes- 00:08:28.795 [2024-07-14 02:56:23.847961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.848000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.848129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.848151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.848278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.848300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.795 #6 NEW cov: 11720 ft: 12375 corp: 4/226b lim: 100 exec/s: 0 rss: 66Mb L: 75/75 MS: 1 ChangeBinInt- 00:08:28.795 [2024-07-14 02:56:23.898684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.898719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.898793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.898818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.898913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.898937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.899067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.899086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.899212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.899236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.795 #7 NEW cov: 11805 ft: 13078 corp: 5/326b lim: 100 exec/s: 0 rss: 66Mb L: 100/100 MS: 1 CopyPart- 00:08:28.795 [2024-07-14 02:56:23.958887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.958918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.958994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.959017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.795 [2024-07-14 02:56:23.959143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.795 [2024-07-14 02:56:23.959165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.796 [2024-07-14 02:56:23.959292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:23.959317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.796 [2024-07-14 02:56:23.959439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:23.959465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.796 #8 NEW cov: 11805 ft: 13224 corp: 6/426b lim: 100 exec/s: 0 rss: 66Mb L: 100/100 MS: 1 CrossOver- 00:08:28.796 [2024-07-14 02:56:24.018968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:24.019000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.796 [2024-07-14 02:56:24.019093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:24.019122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.796 [2024-07-14 02:56:24.019245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:24.019271] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.796 [2024-07-14 02:56:24.019388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:24.019411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.796 [2024-07-14 02:56:24.019541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.796 [2024-07-14 02:56:24.019565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.796 #9 NEW cov: 11805 ft: 13305 corp: 7/526b lim: 100 exec/s: 0 rss: 66Mb L: 100/100 MS: 1 CopyPart- 00:08:29.055 [2024-07-14 02:56:24.078728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.078760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.078849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.078874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.079020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.079041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.055 #10 NEW cov: 11805 ft: 13383 corp: 8/589b lim: 100 exec/s: 0 rss: 67Mb L: 63/100 MS: 1 EraseBytes- 00:08:29.055 [2024-07-14 02:56:24.129292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.129324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.129401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.129427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.129551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.129570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.129698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 
[2024-07-14 02:56:24.129723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.129847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.129867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.055 #11 NEW cov: 11805 ft: 13469 corp: 9/689b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 CrossOver- 00:08:29.055 [2024-07-14 02:56:24.179258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069834014719 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.179291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.055 [2024-07-14 02:56:24.179368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.055 [2024-07-14 02:56:24.179394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.179525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.179544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.179666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.179685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.056 #14 NEW cov: 11805 ft: 13494 corp: 10/779b lim: 100 exec/s: 0 rss: 67Mb L: 90/100 MS: 3 InsertRepeatedBytes-ChangeBit-InsertRepeatedBytes- 00:08:29.056 [2024-07-14 02:56:24.229763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.229796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.229878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.229905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.230033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.230059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.230197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 
nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.230225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.230363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.230384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.056 #15 NEW cov: 11805 ft: 13564 corp: 11/879b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 CrossOver- 00:08:29.056 [2024-07-14 02:56:24.279822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.279852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.279941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.279959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.280084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.280107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.280231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.280253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.056 [2024-07-14 02:56:24.280376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:20240 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.056 [2024-07-14 02:56:24.280401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.315 #16 NEW cov: 11805 ft: 13579 corp: 12/979b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 ChangeBit- 00:08:29.315 [2024-07-14 02:56:24.340080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.340115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.340205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.340232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.340354] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.340396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.340532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.340551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.340681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.340704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.315 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.315 #17 NEW cov: 11828 ft: 13637 corp: 13/1079b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:29.315 [2024-07-14 02:56:24.389666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.389698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.389844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.389872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.390005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.390034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.315 #18 NEW cov: 11828 ft: 13646 corp: 14/1148b lim: 100 exec/s: 0 rss: 67Mb L: 69/100 MS: 1 EraseBytes- 00:08:29.315 [2024-07-14 02:56:24.440388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.440423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.440513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.315 [2024-07-14 02:56:24.440538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.315 [2024-07-14 02:56:24.440665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 
02:56:24.440690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.440819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.440843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.440973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.440999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.316 #19 NEW cov: 11828 ft: 13680 corp: 15/1248b lim: 100 exec/s: 19 rss: 67Mb L: 100/100 MS: 1 ChangeBit- 00:08:29.316 [2024-07-14 02:56:24.489800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.489835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.489976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.490000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.316 #20 NEW cov: 11828 ft: 14022 corp: 16/1307b lim: 100 exec/s: 20 rss: 67Mb L: 59/100 MS: 1 EraseBytes- 00:08:29.316 [2024-07-14 02:56:24.550807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.550841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.550915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.550946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.551077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.551100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.551234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.551248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.316 [2024-07-14 02:56:24.551381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:29.316 [2024-07-14 02:56:24.551405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.575 #21 NEW cov: 11828 ft: 14048 corp: 17/1407b lim: 100 exec/s: 21 rss: 67Mb L: 100/100 MS: 1 CopyPart- 00:08:29.575 [2024-07-14 02:56:24.610434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.610472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.610573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592572985103 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.610597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.610732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.610749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.575 #22 NEW cov: 11828 ft: 14080 corp: 18/1483b lim: 100 exec/s: 22 rss: 67Mb L: 76/100 MS: 1 InsertByte- 00:08:29.575 [2024-07-14 02:56:24.660361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592498077455 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.660394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.660500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3850 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.660523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.575 #27 NEW cov: 11828 ft: 14108 corp: 19/1539b lim: 100 exec/s: 27 rss: 68Mb L: 56/100 MS: 5 PersAutoDict-EraseBytes-EraseBytes-ChangeByte-CrossOver- DE: "\003\000\000\000\000\000\000\000"- 00:08:29.575 [2024-07-14 02:56:24.711331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.711364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.711462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.711489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.711616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.711641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.711774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.711798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.711919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.711943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.575 #28 NEW cov: 11828 ft: 14122 corp: 20/1639b lim: 100 exec/s: 28 rss: 68Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:29.575 [2024-07-14 02:56:24.761483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.761513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.761617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.761639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.761776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.761797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.761930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.761950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.762083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.762108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.575 #29 NEW cov: 11828 ft: 14128 corp: 21/1739b lim: 100 exec/s: 29 rss: 68Mb L: 100/100 MS: 1 ChangeByte- 00:08:29.575 [2024-07-14 02:56:24.821530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.821563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.821649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592572985103 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.821676] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.821823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.575 [2024-07-14 02:56:24.821849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.575 [2024-07-14 02:56:24.821979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.576 [2024-07-14 02:56:24.822008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.836 #30 NEW cov: 11828 ft: 14160 corp: 22/1836b lim: 100 exec/s: 30 rss: 68Mb L: 97/100 MS: 1 CrossOver- 00:08:29.836 [2024-07-14 02:56:24.881075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.881111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:24.881262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.881287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.836 #31 NEW cov: 11828 ft: 14179 corp: 23/1882b lim: 100 exec/s: 31 rss: 68Mb L: 46/100 MS: 1 EraseBytes- 00:08:29.836 [2024-07-14 02:56:24.931449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.931482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:24.931591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.931618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:24.931754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.931779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.836 #32 NEW cov: 11828 ft: 14211 corp: 24/1957b lim: 100 exec/s: 32 rss: 68Mb L: 75/100 MS: 1 CopyPart- 00:08:29.836 [2024-07-14 02:56:24.991718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.991752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:24.991875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.991896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:24.992027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:24.992049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.836 #33 NEW cov: 11828 ft: 14235 corp: 25/2021b lim: 100 exec/s: 33 rss: 68Mb L: 64/100 MS: 1 EraseBytes- 00:08:29.836 [2024-07-14 02:56:25.042437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:25.042480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:25.042575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:25.042599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:25.042724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:25.042748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:25.042873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:25.042899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.836 [2024-07-14 02:56:25.043026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.836 [2024-07-14 02:56:25.043050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.836 #34 NEW cov: 11828 ft: 14247 corp: 26/2121b lim: 100 exec/s: 34 rss: 68Mb L: 100/100 MS: 1 ChangeByte- 00:08:30.096 [2024-07-14 02:56:25.102595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.102636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.102736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.102766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.102897] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.102925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.103054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.103081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.103184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.103208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.096 #35 NEW cov: 11828 ft: 14263 corp: 27/2221b lim: 100 exec/s: 35 rss: 68Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:30.096 [2024-07-14 02:56:25.162480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.162516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.162611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.162635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.162784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.162803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.162934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.162956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.096 #36 NEW cov: 11828 ft: 14276 corp: 28/2312b lim: 100 exec/s: 36 rss: 68Mb L: 91/100 MS: 1 EraseBytes- 00:08:30.096 [2024-07-14 02:56:25.212390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.212424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.212537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.212565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:30.096 [2024-07-14 02:56:25.212703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.212728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.096 #37 NEW cov: 11828 ft: 14288 corp: 29/2381b lim: 100 exec/s: 37 rss: 68Mb L: 69/100 MS: 1 CrossOver- 00:08:30.096 [2024-07-14 02:56:25.262258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571208184 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.262292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.262437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.262466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.096 #38 NEW cov: 11828 ft: 14295 corp: 30/2427b lim: 100 exec/s: 38 rss: 68Mb L: 46/100 MS: 1 ChangeBinInt- 00:08:30.096 [2024-07-14 02:56:25.312715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.312750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.312865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.312887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.096 [2024-07-14 02:56:25.313018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.096 [2024-07-14 02:56:25.313041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.096 #39 NEW cov: 11828 ft: 14365 corp: 31/2502b lim: 100 exec/s: 39 rss: 69Mb L: 75/100 MS: 1 ChangeBinInt- 00:08:30.356 [2024-07-14 02:56:25.363415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.363448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.356 [2024-07-14 02:56:25.363543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.363569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.356 [2024-07-14 02:56:25.363700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
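(A quick decode of the repeated entries above: lba:1085102592571150095 is 0x0f0f0f0f0f0f0f0f and len:3856 is 0x0f10, so the fuzzed input reaching the NVMe code here appears to be a run of 0x0f bytes reinterpreted as COMPARE command fields. Every command also carries nsid:0, which matches no namespace, so each one completes with status 00/0b, NVMe's generic "Invalid Namespace or Format", and dnr:1 marks it do-not-retry. That is the fuzzer working as intended, not a test failure.)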
00:08:30.356 [2024-07-14 02:56:25.363726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.356 [2024-07-14 02:56:25.363855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.363881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.356 [2024-07-14 02:56:25.364011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.364036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.356 #40 NEW cov: 11828 ft: 14372 corp: 32/2602b lim: 100 exec/s: 40 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:08:30.356 [2024-07-14 02:56:25.423088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.423120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.356 [2024-07-14 02:56:25.423251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.423280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.356 [2024-07-14 02:56:25.423411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.356 [2024-07-14 02:56:25.423438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.356 #41 NEW cov: 11828 ft: 14396 corp: 33/2677b lim: 100 exec/s: 20 rss: 69Mb L: 75/100 MS: 1 CMP- DE: "\000\014"- 00:08:30.356 #41 DONE cov: 11828 ft: 14396 corp: 33/2677b lim: 100 exec/s: 20 rss: 69Mb 00:08:30.356 ###### Recommended dictionary. ###### 00:08:30.356 "\003\000\000\000\000\000\000\000" # Uses: 1 00:08:30.356 "\000\014" # Uses: 0 00:08:30.356 ###### End of recommended dictionary. 
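The "Recommended dictionary" block above is standard libFuzzer end-of-run output: byte strings that comparison-guided mutations found useful, flagged as CMP-/DE: on the corpus lines. To reuse them in a later run they could be saved in libFuzzer's dictionary syntax and passed back with -dict=FILE. A sketch, with the file name and entry names chosen arbitrarily and the values transcribed from the octal escapes printed above:

    # nvmf.dict (hypothetical file; not produced by this job)
    kw1="\x03\x00\x00\x00\x00\x00\x00\x00"
    kw2="\x00\x0c"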
###### 00:08:30.356 Done 41 runs in 2 second(s) 00:08:30.356 02:56:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:30.356 02:56:25 -- ../common.sh@72 -- # (( i++ )) 00:08:30.356 02:56:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.356 02:56:25 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:30.356 00:08:30.356 real 1m2.846s 00:08:30.356 user 1m38.535s 00:08:30.357 sys 0m7.719s 00:08:30.357 02:56:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.357 02:56:25 -- common/autotest_common.sh@10 -- # set +x 00:08:30.357 ************************************ 00:08:30.357 END TEST nvmf_fuzz 00:08:30.357 ************************************ 00:08:30.357 02:56:25 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:30.357 02:56:25 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:30.357 02:56:25 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:30.357 02:56:25 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:30.357 02:56:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:30.357 02:56:25 -- common/autotest_common.sh@10 -- # set +x 00:08:30.357 ************************************ 00:08:30.357 START TEST vfio_fuzz 00:08:30.357 ************************************ 00:08:30.618 02:56:25 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:30.618 * Looking for test storage... 00:08:30.618 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:30.618 02:56:25 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:30.618 02:56:25 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:30.618 02:56:25 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:30.618 02:56:25 -- common/autotest_common.sh@34 -- # set -e 00:08:30.618 02:56:25 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:30.618 02:56:25 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:30.618 02:56:25 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:30.618 02:56:25 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:30.618 02:56:25 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:30.618 02:56:25 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:30.618 02:56:25 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:30.618 02:56:25 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:30.618 02:56:25 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:30.618 02:56:25 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:30.618 02:56:25 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:30.618 02:56:25 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:30.618 02:56:25 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:30.618 02:56:25 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:30.618 02:56:25 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:30.618 02:56:25 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:30.618 02:56:25 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:30.618 02:56:25 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:30.618 02:56:25 -- common/build_config.sh@15 -- # 
CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:30.618 02:56:25 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:30.618 02:56:25 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:30.618 02:56:25 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:30.618 02:56:25 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:30.618 02:56:25 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:30.618 02:56:25 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:30.619 02:56:25 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:30.619 02:56:25 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:30.619 02:56:25 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:30.619 02:56:25 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:30.619 02:56:25 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:30.619 02:56:25 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:30.619 02:56:25 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:30.619 02:56:25 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:30.619 02:56:25 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:30.619 02:56:25 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:30.619 02:56:25 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:30.619 02:56:25 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:30.619 02:56:25 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:30.619 02:56:25 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:30.619 02:56:25 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:30.619 02:56:25 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:30.619 02:56:25 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:30.619 02:56:25 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:30.619 02:56:25 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:30.619 02:56:25 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:30.619 02:56:25 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:30.619 02:56:25 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:30.619 02:56:25 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:30.619 02:56:25 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:30.619 02:56:25 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:30.619 02:56:25 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:30.619 02:56:25 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:30.619 02:56:25 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:30.619 02:56:25 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:30.619 02:56:25 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:30.619 02:56:25 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:30.619 02:56:25 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:30.619 02:56:25 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:30.619 02:56:25 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:30.619 02:56:25 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:30.619 02:56:25 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:30.619 02:56:25 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:30.619 02:56:25 -- 
common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:30.619 02:56:25 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:30.619 02:56:25 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:30.619 02:56:25 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:30.619 02:56:25 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:30.619 02:56:25 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:30.619 02:56:25 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:30.619 02:56:25 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:30.619 02:56:25 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:30.619 02:56:25 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:30.619 02:56:25 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:30.619 02:56:25 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:30.619 02:56:25 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:30.619 02:56:25 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:30.619 02:56:25 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:30.619 02:56:25 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:30.619 02:56:25 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:30.619 02:56:25 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:30.619 02:56:25 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:30.619 02:56:25 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:30.619 02:56:25 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:30.619 02:56:25 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:30.619 02:56:25 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:30.619 02:56:25 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:30.619 02:56:25 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:30.619 02:56:25 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:30.619 02:56:25 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:30.619 02:56:25 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:30.619 02:56:25 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:30.619 02:56:25 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:30.619 02:56:25 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:30.619 02:56:25 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:30.619 02:56:25 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:30.619 02:56:25 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:30.619 02:56:25 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:30.619 02:56:25 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:30.619 02:56:25 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:30.619 #define SPDK_CONFIG_H 00:08:30.619 #define SPDK_CONFIG_APPS 1 00:08:30.619 #define SPDK_CONFIG_ARCH native 00:08:30.619 
#undef SPDK_CONFIG_ASAN 00:08:30.619 #undef SPDK_CONFIG_AVAHI 00:08:30.619 #undef SPDK_CONFIG_CET 00:08:30.619 #define SPDK_CONFIG_COVERAGE 1 00:08:30.619 #define SPDK_CONFIG_CROSS_PREFIX 00:08:30.619 #undef SPDK_CONFIG_CRYPTO 00:08:30.619 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:30.619 #undef SPDK_CONFIG_CUSTOMOCF 00:08:30.619 #undef SPDK_CONFIG_DAOS 00:08:30.619 #define SPDK_CONFIG_DAOS_DIR 00:08:30.619 #define SPDK_CONFIG_DEBUG 1 00:08:30.619 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:30.619 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:30.619 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:30.619 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:30.619 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:30.619 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:30.619 #define SPDK_CONFIG_EXAMPLES 1 00:08:30.619 #undef SPDK_CONFIG_FC 00:08:30.619 #define SPDK_CONFIG_FC_PATH 00:08:30.619 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:30.619 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:30.619 #undef SPDK_CONFIG_FUSE 00:08:30.619 #define SPDK_CONFIG_FUZZER 1 00:08:30.619 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:30.619 #undef SPDK_CONFIG_GOLANG 00:08:30.619 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:30.619 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:30.619 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:30.619 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:30.619 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:30.619 #define SPDK_CONFIG_IDXD 1 00:08:30.619 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:30.619 #undef SPDK_CONFIG_IPSEC_MB 00:08:30.619 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:30.619 #define SPDK_CONFIG_ISAL 1 00:08:30.619 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:30.619 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:30.619 #define SPDK_CONFIG_LIBDIR 00:08:30.619 #undef SPDK_CONFIG_LTO 00:08:30.619 #define SPDK_CONFIG_MAX_LCORES 00:08:30.619 #define SPDK_CONFIG_NVME_CUSE 1 00:08:30.619 #undef SPDK_CONFIG_OCF 00:08:30.620 #define SPDK_CONFIG_OCF_PATH 00:08:30.620 #define SPDK_CONFIG_OPENSSL_PATH 00:08:30.620 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:30.620 #undef SPDK_CONFIG_PGO_USE 00:08:30.620 #define SPDK_CONFIG_PREFIX /usr/local 00:08:30.620 #undef SPDK_CONFIG_RAID5F 00:08:30.620 #undef SPDK_CONFIG_RBD 00:08:30.620 #define SPDK_CONFIG_RDMA 1 00:08:30.620 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:30.620 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:30.620 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:30.620 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:30.620 #undef SPDK_CONFIG_SHARED 00:08:30.620 #undef SPDK_CONFIG_SMA 00:08:30.620 #define SPDK_CONFIG_TESTS 1 00:08:30.620 #undef SPDK_CONFIG_TSAN 00:08:30.620 #define SPDK_CONFIG_UBLK 1 00:08:30.620 #define SPDK_CONFIG_UBSAN 1 00:08:30.620 #undef SPDK_CONFIG_UNIT_TESTS 00:08:30.620 #undef SPDK_CONFIG_URING 00:08:30.620 #define SPDK_CONFIG_URING_PATH 00:08:30.620 #undef SPDK_CONFIG_URING_ZNS 00:08:30.620 #undef SPDK_CONFIG_USDT 00:08:30.620 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:30.620 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:30.620 #define SPDK_CONFIG_VFIO_USER 1 00:08:30.620 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:30.620 #define SPDK_CONFIG_VHOST 1 00:08:30.620 #define SPDK_CONFIG_VIRTIO 1 00:08:30.620 #undef SPDK_CONFIG_VTUNE 00:08:30.620 #define SPDK_CONFIG_VTUNE_DIR 00:08:30.620 #define SPDK_CONFIG_WERROR 1 
00:08:30.620 #define SPDK_CONFIG_WPDK_DIR 00:08:30.620 #undef SPDK_CONFIG_XNVME 00:08:30.620 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:30.620 02:56:25 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:30.620 02:56:25 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:30.620 02:56:25 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:30.620 02:56:25 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:30.620 02:56:25 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:30.620 02:56:25 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:30.620 02:56:25 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:30.620 02:56:25 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:30.620 02:56:25 -- paths/export.sh@5 -- # export PATH 00:08:30.620 02:56:25 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:30.620 02:56:25 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:30.620 02:56:25 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:30.620 02:56:25 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:30.620 02:56:25 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:30.620 02:56:25 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:30.620 02:56:25 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:30.620 02:56:25 -- 
pm/common@16 -- # TEST_TAG=N/A 00:08:30.620 02:56:25 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:30.620 02:56:25 -- common/autotest_common.sh@52 -- # : 1 00:08:30.620 02:56:25 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:30.620 02:56:25 -- common/autotest_common.sh@56 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:30.620 02:56:25 -- common/autotest_common.sh@58 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:30.620 02:56:25 -- common/autotest_common.sh@60 -- # : 1 00:08:30.620 02:56:25 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:30.620 02:56:25 -- common/autotest_common.sh@62 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:30.620 02:56:25 -- common/autotest_common.sh@64 -- # : 00:08:30.620 02:56:25 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:30.620 02:56:25 -- common/autotest_common.sh@66 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:30.620 02:56:25 -- common/autotest_common.sh@68 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:30.620 02:56:25 -- common/autotest_common.sh@70 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:30.620 02:56:25 -- common/autotest_common.sh@72 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:30.620 02:56:25 -- common/autotest_common.sh@74 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:30.620 02:56:25 -- common/autotest_common.sh@76 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:30.620 02:56:25 -- common/autotest_common.sh@78 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:30.620 02:56:25 -- common/autotest_common.sh@80 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:30.620 02:56:25 -- common/autotest_common.sh@82 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:30.620 02:56:25 -- common/autotest_common.sh@84 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:30.620 02:56:25 -- common/autotest_common.sh@86 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:30.620 02:56:25 -- common/autotest_common.sh@88 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:30.620 02:56:25 -- common/autotest_common.sh@90 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:30.620 02:56:25 -- common/autotest_common.sh@92 -- # : 1 00:08:30.620 02:56:25 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:30.620 02:56:25 -- common/autotest_common.sh@94 -- # : 1 00:08:30.620 02:56:25 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:30.620 02:56:25 -- common/autotest_common.sh@96 -- # : rdma 00:08:30.620 02:56:25 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:30.620 02:56:25 -- common/autotest_common.sh@98 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@99 -- # 
export SPDK_TEST_RBD 00:08:30.620 02:56:25 -- common/autotest_common.sh@100 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:30.620 02:56:25 -- common/autotest_common.sh@102 -- # : 0 00:08:30.620 02:56:25 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:30.620 02:56:25 -- common/autotest_common.sh@104 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:30.621 02:56:25 -- common/autotest_common.sh@106 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:30.621 02:56:25 -- common/autotest_common.sh@108 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:30.621 02:56:25 -- common/autotest_common.sh@110 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:30.621 02:56:25 -- common/autotest_common.sh@112 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:30.621 02:56:25 -- common/autotest_common.sh@114 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:30.621 02:56:25 -- common/autotest_common.sh@116 -- # : 1 00:08:30.621 02:56:25 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:30.621 02:56:25 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:30.621 02:56:25 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:30.621 02:56:25 -- common/autotest_common.sh@120 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:30.621 02:56:25 -- common/autotest_common.sh@122 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:30.621 02:56:25 -- common/autotest_common.sh@124 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:30.621 02:56:25 -- common/autotest_common.sh@126 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:30.621 02:56:25 -- common/autotest_common.sh@128 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:30.621 02:56:25 -- common/autotest_common.sh@130 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:30.621 02:56:25 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:30.621 02:56:25 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:30.621 02:56:25 -- common/autotest_common.sh@134 -- # : true 00:08:30.621 02:56:25 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:30.621 02:56:25 -- common/autotest_common.sh@136 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:30.621 02:56:25 -- common/autotest_common.sh@138 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:30.621 02:56:25 -- common/autotest_common.sh@140 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:30.621 02:56:25 -- common/autotest_common.sh@142 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:30.621 02:56:25 -- common/autotest_common.sh@144 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:30.621 02:56:25 -- common/autotest_common.sh@146 
-- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:30.621 02:56:25 -- common/autotest_common.sh@148 -- # : 00:08:30.621 02:56:25 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:30.621 02:56:25 -- common/autotest_common.sh@150 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:30.621 02:56:25 -- common/autotest_common.sh@152 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:30.621 02:56:25 -- common/autotest_common.sh@154 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:30.621 02:56:25 -- common/autotest_common.sh@156 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:30.621 02:56:25 -- common/autotest_common.sh@158 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:30.621 02:56:25 -- common/autotest_common.sh@160 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:30.621 02:56:25 -- common/autotest_common.sh@163 -- # : 00:08:30.621 02:56:25 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:30.621 02:56:25 -- common/autotest_common.sh@165 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:30.621 02:56:25 -- common/autotest_common.sh@167 -- # : 0 00:08:30.621 02:56:25 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:30.621 02:56:25 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@174 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:30.621 02:56:25 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:30.621 02:56:25 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:30.621 02:56:25 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:30.621 02:56:25 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:30.621 02:56:25 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:30.621 02:56:25 -- common/autotest_common.sh@185 -- 
# PYTHONDONTWRITEBYTECODE=1 00:08:30.621 02:56:25 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:30.621 02:56:25 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:30.621 02:56:25 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:30.621 02:56:25 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:30.622 02:56:25 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:30.622 02:56:25 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:30.622 02:56:25 -- common/autotest_common.sh@196 -- # cat 00:08:30.622 02:56:25 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:30.622 02:56:25 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:30.622 02:56:25 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:30.622 02:56:25 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:30.622 02:56:25 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:30.622 02:56:25 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:30.622 02:56:25 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:30.622 02:56:25 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:30.622 02:56:25 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:30.622 02:56:25 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:30.622 02:56:25 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:30.622 02:56:25 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:30.622 02:56:25 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:30.622 02:56:25 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:30.622 02:56:25 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:30.622 02:56:25 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:30.622 02:56:25 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:30.622 02:56:25 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:30.622 02:56:25 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:30.622 02:56:25 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:30.622 02:56:25 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:30.622 02:56:25 -- common/autotest_common.sh@249 -- # valgrind= 00:08:30.622 02:56:25 -- common/autotest_common.sh@255 -- # uname -s 00:08:30.622 02:56:25 -- 
common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:30.622 02:56:25 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:30.622 02:56:25 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:30.622 02:56:25 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:30.622 02:56:25 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:30.622 02:56:25 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:30.622 02:56:25 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:30.622 02:56:25 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:30.622 02:56:25 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:30.622 02:56:25 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:30.622 02:56:25 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:30.622 02:56:25 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:30.622 02:56:25 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:30.622 02:56:25 -- common/autotest_common.sh@309 -- # [[ -z 677757 ]] 00:08:30.622 02:56:25 -- common/autotest_common.sh@309 -- # kill -0 677757 00:08:30.622 02:56:25 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:30.622 02:56:25 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:30.622 02:56:25 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:30.622 02:56:25 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:30.622 02:56:25 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:30.622 02:56:25 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:30.622 02:56:25 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:30.622 02:56:25 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:30.622 02:56:25 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.UMP5oy 00:08:30.622 02:56:25 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:30.622 02:56:25 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:30.622 02:56:25 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:08:30.622 02:56:25 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.UMP5oy/tests/vfio /tmp/spdk.UMP5oy 00:08:30.622 02:56:25 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@318 -- # df -T 00:08:30.622 02:56:25 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # 
avails["$mount"]=954408960 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # avails["$mount"]=53077458944 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=8664858624 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870515712 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=643072 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:30.622 02:56:25 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:30.622 02:56:25 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:30.622 02:56:25 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:30.622 02:56:25 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:30.622 * Looking for test storage... 
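(For the storage search traced next: set_test_storage was asked for 2147483648 bytes, which the script apparently pads to requested_size=2214592512, an extra 67108864 (64 MiB), before checking the df rows captured above. The overlay root has 53077458944 bytes available, and the projected usage works out to new_size = uses + requested = 8664858624 + 2214592512 = 10879451136, about 17.6% of the 61742317568-byte filesystem, well under the script's 95% guard, so the vfio test directory is used in place.)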
00:08:30.622 02:56:25 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:30.622 02:56:25 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:30.622 02:56:25 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:30.622 02:56:25 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:30.623 02:56:25 -- common/autotest_common.sh@363 -- # mount=/ 00:08:30.623 02:56:25 -- common/autotest_common.sh@365 -- # target_space=53077458944 00:08:30.623 02:56:25 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:30.623 02:56:25 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:30.623 02:56:25 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:30.623 02:56:25 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:30.623 02:56:25 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:30.623 02:56:25 -- common/autotest_common.sh@372 -- # new_size=10879451136 00:08:30.623 02:56:25 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:30.623 02:56:25 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:30.623 02:56:25 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:30.623 02:56:25 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:30.623 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:30.623 02:56:25 -- common/autotest_common.sh@380 -- # return 0 00:08:30.623 02:56:25 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:30.623 02:56:25 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:30.623 02:56:25 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:30.623 02:56:25 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:30.623 02:56:25 -- common/autotest_common.sh@1672 -- # true 00:08:30.623 02:56:25 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:30.623 02:56:25 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:30.623 02:56:25 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:30.623 02:56:25 -- common/autotest_common.sh@27 -- # exec 00:08:30.623 02:56:25 -- common/autotest_common.sh@29 -- # exec 00:08:30.623 02:56:25 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:30.623 02:56:25 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:30.623 02:56:25 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:30.623 02:56:25 -- common/autotest_common.sh@18 -- # set -x 00:08:30.623 02:56:25 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:30.623 02:56:25 -- ../common.sh@8 -- # pids=() 00:08:30.623 02:56:25 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:30.623 02:56:25 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:30.623 02:56:25 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:30.623 02:56:25 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:30.623 02:56:25 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:30.623 02:56:25 -- vfio/run.sh@65 -- # mem_size=0 00:08:30.623 02:56:25 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:30.623 02:56:25 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:30.623 02:56:25 -- ../common.sh@69 -- # local fuzz_num=7 00:08:30.623 02:56:25 -- ../common.sh@70 -- # local time=1 00:08:30.623 02:56:25 -- ../common.sh@72 -- # (( i = 0 )) 00:08:30.623 02:56:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.623 02:56:25 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:30.623 02:56:25 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:30.623 02:56:25 -- vfio/run.sh@23 -- # local timen=1 00:08:30.623 02:56:25 -- vfio/run.sh@24 -- # local core=0x1 00:08:30.623 02:56:25 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:30.623 02:56:25 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:30.623 02:56:25 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:30.623 02:56:25 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:30.623 02:56:25 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:30.623 02:56:25 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:30.623 02:56:25 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:30.623 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:30.623 02:56:25 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:30.623 [2024-07-14 02:56:25.852821] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
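llvm_vfio_fuzz, launched above against the vfio-user sockets under /tmp/vfio-user-0 (with -D naming the corpus directory and -t 1 matching the one-second-per-fuzzer budget set earlier), is a libFuzzer binary: the INFO:/cov: lines that follow are libFuzzer's, and the TestOneInput and fuzz_vfio_user_* frames in the NEW_FUNC traces hang off its standard per-input callback. The *ERROR* lines from vfio_user.c in these runs record the target rejecting fuzzed messages, which is expected during fuzzing rather than a harness failure. For orientation, a minimal sketch of the callback shape; only the LLVMFuzzerTestOneInput signature is the real libFuzzer contract, everything else is illustrative and not SPDK's actual harness in llvm_vfio_fuzz.c:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical stand-in for the code under test. */
    static void exercise_target(const uint8_t *buf, size_t len)
    {
        (void)buf;
        (void)len;
    }

    /* libFuzzer calls this once per generated input; crashes and
     * sanitizer reports, not the return value, are the signal. */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        if (size == 0) {
            return 0;            /* nothing to feed the target */
        }
        exercise_target(data, size);
        return 0;                /* libFuzzer expects 0 here */
    }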
00:08:30.623 [2024-07-14 02:56:25.852897] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid677794 ] 00:08:30.883 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.883 [2024-07-14 02:56:25.923115] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.883 [2024-07-14 02:56:25.959530] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.883 [2024-07-14 02:56:25.959673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.142 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.142 INFO: Seed: 373528468 00:08:31.142 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:31.142 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:31.142 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:31.142 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.142 #2 INITED exec/s: 0 rss: 60Mb 00:08:31.142 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.142 This may also happen if the target rejected all inputs we tried so far 00:08:31.402 NEW_FUNC[1/621]: 0x491220 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:31.402 NEW_FUNC[2/621]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:31.402 #12 NEW cov: 10467 ft: 10715 corp: 2/10b lim: 60 exec/s: 0 rss: 66Mb L: 9/9 MS: 5 CopyPart-CopyPart-EraseBytes-ChangeBit-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.661 NEW_FUNC[1/10]: 0x499620 in write_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:354 00:08:31.661 NEW_FUNC[2/10]: 0x4a7d90 in spdk_bdev_io_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/bdev_module.h:1351 00:08:31.661 #13 NEW cov: 10730 ft: 13020 corp: 3/58b lim: 60 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:31.920 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.920 #14 NEW cov: 10747 ft: 14876 corp: 4/68b lim: 60 exec/s: 0 rss: 68Mb L: 10/48 MS: 1 InsertByte- 00:08:32.179 #15 NEW cov: 10747 ft: 16111 corp: 5/77b lim: 60 exec/s: 15 rss: 68Mb L: 9/48 MS: 1 CopyPart- 00:08:32.179 #16 NEW cov: 10747 ft: 16636 corp: 6/125b lim: 60 exec/s: 16 rss: 70Mb L: 48/48 MS: 1 CrossOver- 00:08:32.437 #17 NEW cov: 10747 ft: 16804 corp: 7/173b lim: 60 exec/s: 17 rss: 70Mb L: 48/48 MS: 1 CMP- DE: "\177\000\000\000\000\000\000\000"- 00:08:32.696 #18 NEW cov: 10747 ft: 17194 corp: 8/189b lim: 60 exec/s: 18 rss: 70Mb L: 16/48 MS: 1 CopyPart- 00:08:32.955 #19 NEW cov: 10754 ft: 17244 corp: 9/218b lim: 60 exec/s: 19 rss: 70Mb L: 29/48 MS: 1 EraseBytes- 00:08:32.955 #20 NEW cov: 10754 ft: 17353 corp: 10/266b lim: 60 exec/s: 20 rss: 70Mb L: 48/48 MS: 1 ChangeBit- 00:08:33.214 #21 NEW cov: 10754 ft: 17586 corp: 11/275b lim: 60 exec/s: 10 rss: 70Mb L: 9/48 MS: 1 ChangeBit- 00:08:33.214 #21 DONE cov: 10754 ft: 17586 corp: 11/275b lim: 60 exec/s: 10 rss: 70Mb 00:08:33.214 ###### Recommended dictionary. 
###### 00:08:33.215 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:33.215 "\177\000\000\000\000\000\000\000" # Uses: 0 00:08:33.215 ###### End of recommended dictionary. ###### 00:08:33.215 Done 21 runs in 2 second(s) 00:08:33.474 02:56:28 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:33.474 02:56:28 -- ../common.sh@72 -- # (( i++ )) 00:08:33.474 02:56:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.474 02:56:28 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:33.474 02:56:28 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:33.474 02:56:28 -- vfio/run.sh@23 -- # local timen=1 00:08:33.474 02:56:28 -- vfio/run.sh@24 -- # local core=0x1 00:08:33.474 02:56:28 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:33.474 02:56:28 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:33.474 02:56:28 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:33.474 02:56:28 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:33.474 02:56:28 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:33.474 02:56:28 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:33.474 02:56:28 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:33.474 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:33.474 02:56:28 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:33.474 [2024-07-14 02:56:28.625375] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:33.474 [2024-07-14 02:56:28.625475] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid678337 ] 00:08:33.474 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.474 [2024-07-14 02:56:28.697895] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.734 [2024-07-14 02:56:28.734137] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:33.734 [2024-07-14 02:56:28.734279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.734 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.734 INFO: Seed: 3142543798 00:08:33.734 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:33.734 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:33.734 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:33.734 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.734 #2 INITED exec/s: 0 rss: 60Mb 00:08:33.734 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:33.734 This may also happen if the target rejected all inputs we tried so far
00:08:33.993 [2024-07-14 02:56:28.989473] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:33.993 [2024-07-14 02:56:28.989510] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:33.993 [2024-07-14 02:56:28.989529] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:34.251 NEW_FUNC[1/638]: 0x4917c0 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72
00:08:34.251 NEW_FUNC[2/638]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:34.251 #7 NEW cov: 10728 ft: 10162 corp: 2/30b lim: 40 exec/s: 0 rss: 66Mb L: 29/29 MS: 5 ChangeBinInt-CrossOver-ShuffleBytes-CopyPart-InsertRepeatedBytes-
00:08:34.251 [2024-07-14 02:56:29.390376] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:34.251 [2024-07-14 02:56:29.390413] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:34.251 [2024-07-14 02:56:29.390432] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:34.251 #12 NEW cov: 10745 ft: 12536 corp: 3/61b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 5 ShuffleBytes-CopyPart-ChangeBit-CrossOver-InsertRepeatedBytes-
00:08:34.510 [2024-07-14 02:56:29.515312] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:34.510 [2024-07-14 02:56:29.515340] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:34.510 [2024-07-14 02:56:29.515358] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:34.510 #13 NEW cov: 10745 ft: 13844 corp: 4/90b lim: 40 exec/s: 0 rss: 69Mb L: 29/31 MS: 1 ChangeByte-
00:08:34.510 [2024-07-14 02:56:29.629288] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:34.510 [2024-07-14 02:56:29.629314] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:34.510 [2024-07-14 02:56:29.629333] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:34.510 #19 NEW cov: 10745 ft: 14194 corp: 5/125b lim: 40 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart-
00:08:34.510 [2024-07-14 02:56:29.743153] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:34.510 [2024-07-14 02:56:29.743178] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:34.510 [2024-07-14 02:56:29.743197] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:34.769 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:34.769 #20 NEW cov: 10762 ft: 14412 corp: 6/156b lim: 40 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ShuffleBytes-
00:08:34.769 [2024-07-14 02:56:29.857016] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:34.769 [2024-07-14 02:56:29.857041] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:34.769 [2024-07-14 02:56:29.857059] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:34.769 #24 NEW cov: 10762 ft: 14460 corp: 7/165b lim: 40 exec/s: 24 rss: 69Mb L: 9/35 MS: 4 ChangeByte-CrossOver-ChangeBinInt-CrossOver-
00:08:34.769 [2024-07-14 02:56:29.971721] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:34.769 [2024-07-14 02:56:29.971746] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:34.769 [2024-07-14 02:56:29.971765] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.027 #25 NEW cov: 10762 ft: 14785 corp: 8/195b lim: 40 exec/s: 25 rss: 69Mb L: 30/35 MS: 1 CrossOver-
00:08:35.027 [2024-07-14 02:56:30.086768] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.027 [2024-07-14 02:56:30.086797] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.027 [2024-07-14 02:56:30.086817] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.027 #31 NEW cov: 10762 ft: 15167 corp: 9/224b lim: 40 exec/s: 31 rss: 69Mb L: 29/35 MS: 1 ChangeBit-
00:08:35.027 [2024-07-14 02:56:30.201672] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.027 [2024-07-14 02:56:30.201699] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.027 [2024-07-14 02:56:30.201718] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.027 #32 NEW cov: 10762 ft: 15197 corp: 10/253b lim: 40 exec/s: 32 rss: 69Mb L: 29/35 MS: 1 CMP- DE: "\377\377"-
00:08:35.287 [2024-07-14 02:56:30.317557] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.287 [2024-07-14 02:56:30.317582] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.287 [2024-07-14 02:56:30.317601] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.287 #33 NEW cov: 10762 ft: 15421 corp: 11/263b lim: 40 exec/s: 33 rss: 69Mb L: 10/35 MS: 1 InsertByte-
00:08:35.287 [2024-07-14 02:56:30.432332] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.287 [2024-07-14 02:56:30.432358] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.287 [2024-07-14 02:56:30.432376] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.287 #34 NEW cov: 10762 ft: 15470 corp: 12/292b lim: 40 exec/s: 34 rss: 69Mb L: 29/35 MS: 1 ShuffleBytes-
00:08:35.546 [2024-07-14 02:56:30.549259] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.546 [2024-07-14 02:56:30.549284] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.546 [2024-07-14 02:56:30.549303] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.546 #35 NEW cov: 10762 ft: 15502 corp: 13/322b lim: 40 exec/s: 35 rss: 69Mb L: 30/35 MS: 1 InsertByte-
00:08:35.546 [2024-07-14 02:56:30.666254] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.546 [2024-07-14 02:56:30.666282] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.546 [2024-07-14 02:56:30.666302] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.546 #36 NEW cov: 10762 ft: 15960 corp: 14/353b lim: 40 exec/s: 36 rss: 69Mb L: 31/35 MS: 1 CrossOver-
00:08:35.546 [2024-07-14 02:56:30.781269] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.546 [2024-07-14 02:56:30.781295] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.546 [2024-07-14 02:56:30.781313] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.806 #37 NEW cov: 10769 ft: 15981 corp: 15/383b lim: 40 exec/s: 37 rss: 69Mb L: 30/35 MS: 1 ChangeByte-
00:08:35.806 [2024-07-14 02:56:30.896152] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:35.806 [2024-07-14 02:56:30.896178] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:35.806 [2024-07-14 02:56:30.896196] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:08:35.806 #38 NEW cov: 10769 ft: 16097 corp: 16/414b lim: 40 exec/s: 19 rss: 69Mb L: 31/35 MS: 1 ChangeASCIIInt-
00:08:35.806 #38 DONE cov: 10769 ft: 16097 corp: 16/414b lim: 40 exec/s: 19 rss: 69Mb
00:08:35.806 ###### Recommended dictionary. ######
00:08:35.806 "\377\377" # Uses: 0
00:08:35.806 ###### End of recommended dictionary. ######
00:08:35.806 Done 38 runs in 2 second(s)
00:08:36.070 02:56:31 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1
00:08:36.070 02:56:31 -- ../common.sh@72 -- # (( i++ ))
00:08:36.070 02:56:31 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:36.070 02:56:31 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1
00:08:36.070 02:56:31 -- vfio/run.sh@22 -- # local fuzzer_type=2
00:08:36.070 02:56:31 -- vfio/run.sh@23 -- # local timen=1
00:08:36.070 02:56:31 -- vfio/run.sh@24 -- # local core=0x1
00:08:36.070 02:56:31 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:08:36.070 02:56:31 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2
00:08:36.070 02:56:31 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1
00:08:36.070 02:56:31 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2
00:08:36.070 02:56:31 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf
00:08:36.070 02:56:31 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:08:36.070 02:56:31 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%;
00:08:36.070 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:36.070 02:56:31 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2
00:08:36.070 [2024-07-14 02:56:31.245788] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:36.070 [2024-07-14 02:56:31.245840] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid678879 ]
00:08:36.070 EAL: No free 2048 kB hugepages reported on node 1
00:08:36.070 [2024-07-14 02:56:31.313827] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:36.448 [2024-07-14 02:56:31.350581] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:36.448 [2024-07-14 02:56:31.350729] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:36.448 INFO: Running with entropic power schedule (0xFF, 100).
00:08:36.448 INFO: Seed: 1469576556
00:08:36.448 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:36.448 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:36.448 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:08:36.448 INFO: A corpus is not provided, starting from an empty corpus
00:08:36.448 #2 INITED exec/s: 0 rss: 60Mb
00:08:36.448 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:36.448 This may also happen if the target rejected all inputs we tried so far
00:08:36.448 [2024-07-14 02:56:31.607789] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:36.966 NEW_FUNC[1/635]: 0x4921a0 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104
00:08:36.966 NEW_FUNC[2/635]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:36.966 #12 NEW cov: 10697 ft: 10663 corp: 2/24b lim: 80 exec/s: 0 rss: 66Mb L: 23/23 MS: 5 ChangeByte-CrossOver-ChangeBit-ChangeBinInt-InsertRepeatedBytes-
00:08:36.966 [2024-07-14 02:56:32.019587] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:36.966 NEW_FUNC[1/1]: 0x1bc15e0 in accel_process_sequence /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel.c:1775
00:08:36.966 #18 NEW cov: 10725 ft: 13828 corp: 3/62b lim: 80 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CrossOver-
00:08:36.966 [2024-07-14 02:56:32.143501] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:36.966 #19 NEW cov: 10725 ft: 14166 corp: 4/100b lim: 80 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeBinInt-
00:08:37.226 [2024-07-14 02:56:32.267620] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:37.226 #20 NEW cov: 10725 ft: 14704 corp: 5/138b lim: 80 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeByte-
00:08:37.226 [2024-07-14 02:56:32.381536] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:37.226 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:37.226 #21 NEW cov: 10742 ft: 14867 corp: 6/184b lim: 80 exec/s: 0 rss: 69Mb L: 46/46 MS: 1 InsertRepeatedBytes-
00:08:37.485 [2024-07-14 02:56:32.505385] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:37.485 #22 NEW cov: 10742 ft: 15447 corp: 7/259b lim: 80 exec/s: 22 rss: 69Mb L: 75/75 MS: 1 CrossOver-
00:08:37.485 [2024-07-14 02:56:32.619496] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:37.485 #23 NEW cov: 10742 ft: 16173 corp: 8/297b lim: 80 exec/s: 23 rss: 69Mb L: 38/75 MS: 1 ChangeBinInt-
00:08:37.485 [2024-07-14 02:56:32.733323] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:37.743 #24 NEW cov: 10742 ft: 16532 corp: 9/320b lim: 80 exec/s: 24 rss: 69Mb L: 23/75 MS: 1 ChangeBit-
00:08:37.744 [2024-07-14 02:56:32.846139] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:37.744 #25 NEW cov: 10742 ft: 16626 corp: 10/334b lim: 80 exec/s: 25 rss: 69Mb L: 14/75 MS: 1 EraseBytes-
00:08:37.744 [2024-07-14 02:56:32.960000] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:38.002 #26 NEW cov: 10742 ft: 16665 corp: 11/380b lim: 80 exec/s: 26 rss: 69Mb L: 46/75 MS: 1 CrossOver-
00:08:38.002 [2024-07-14 02:56:33.073539] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:38.003 #27 NEW cov: 10742 ft: 16685 corp: 12/455b lim: 80 exec/s: 27 rss: 69Mb L: 75/75 MS: 1 ShuffleBytes-
00:08:38.003 [2024-07-14 02:56:33.187240] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:38.261 #28 NEW cov: 10742 ft: 16769 corp: 13/500b lim: 80 exec/s: 28 rss: 69Mb L: 45/75 MS: 1 CrossOver-
00:08:38.261 [2024-07-14 02:56:33.301036] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:38.261 #29 NEW cov: 10749 ft: 17006 corp: 14/523b lim: 80 exec/s: 29 rss: 70Mb L: 23/75 MS: 1 ChangeByte-
00:08:38.261 [2024-07-14 02:56:33.414900] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:38.261 #32 NEW cov: 10749 ft: 17084 corp: 15/547b lim: 80 exec/s: 32 rss: 70Mb L: 24/75 MS: 3 ChangeBit-ShuffleBytes-CrossOver-
00:08:38.519 [2024-07-14 02:56:33.528229] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:08:38.519 #33 NEW cov: 10749 ft: 17221 corp: 16/583b lim: 80 exec/s: 16 rss: 70Mb L: 36/75 MS: 1 CrossOver-
00:08:38.519 #33 DONE cov: 10749 ft: 17221 corp: 16/583b lim: 80 exec/s: 16 rss: 70Mb
00:08:38.519 Done 33 runs in 2 second(s)
00:08:38.779 02:56:33 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2
00:08:38.779 02:56:33 -- ../common.sh@72 -- # (( i++ ))
00:08:38.779 02:56:33 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:38.779 02:56:33 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
00:08:38.779 02:56:33 -- vfio/run.sh@22 -- # local fuzzer_type=3
00:08:38.779 02:56:33 -- vfio/run.sh@23 -- # local timen=1
00:08:38.779 02:56:33 -- vfio/run.sh@24 -- # local core=0x1
00:08:38.779 02:56:33 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:08:38.779 02:56:33 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3
00:08:38.779 02:56:33 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1
00:08:38.779 02:56:33 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2
00:08:38.779 02:56:33 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf
00:08:38.779 02:56:33 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:08:38.779 02:56:33 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%;
00:08:38.779 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:38.779 02:56:33 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3
00:08:38.779 [2024-07-14 02:56:33.885466] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:38.779 [2024-07-14 02:56:33.885537] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid679233 ]
00:08:38.779 EAL: No free 2048 kB hugepages reported on node 1
00:08:38.779 [2024-07-14 02:56:33.956125] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:38.779 [2024-07-14 02:56:33.992942] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:38.779 [2024-07-14 02:56:33.993082] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:39.038 INFO: Running with entropic power schedule (0xFF, 100).
00:08:39.038 INFO: Seed: 4109860686
00:08:39.038 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:39.038 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:39.038 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:08:39.038 INFO: A corpus is not provided, starting from an empty corpus
00:08:39.038 #2 INITED exec/s: 0 rss: 60Mb
00:08:39.038 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:39.038 This may also happen if the target rejected all inputs we tried so far
00:08:39.555 NEW_FUNC[1/632]: 0x492880 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125
00:08:39.555 NEW_FUNC[2/632]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:39.555 #18 NEW cov: 10696 ft: 10667 corp: 2/58b lim: 320 exec/s: 0 rss: 67Mb L: 57/57 MS: 1 InsertRepeatedBytes-
00:08:39.814 #21 NEW cov: 10713 ft: 13473 corp: 3/183b lim: 320 exec/s: 0 rss: 68Mb L: 125/125 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes-
00:08:39.814 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:39.814 #22 NEW cov: 10730 ft: 14290 corp: 4/274b lim: 320 exec/s: 0 rss: 69Mb L: 91/125 MS: 1 CopyPart-
00:08:40.073 #23 NEW cov: 10730 ft: 15541 corp: 5/399b lim: 320 exec/s: 23 rss: 69Mb L: 125/125 MS: 1 ChangeBinInt-
00:08:40.332 #24 NEW cov: 10730 ft: 16023 corp: 6/524b lim: 320 exec/s: 24 rss: 69Mb L: 125/125 MS: 1 ChangeBinInt-
00:08:40.591 #25 NEW cov: 10730 ft: 16257 corp: 7/649b lim: 320 exec/s: 25 rss: 69Mb L: 125/125 MS: 1 ChangeByte-
00:08:40.591 #26 NEW cov: 10730 ft: 16409 corp: 8/808b lim: 320 exec/s: 26 rss: 69Mb L: 159/159 MS: 1 InsertRepeatedBytes-
00:08:40.850 #27 NEW cov: 10737 ft: 16608 corp: 9/903b lim: 320 exec/s: 27 rss: 69Mb L: 95/159 MS: 1 CMP- DE: "\000\000\000\000"-
00:08:41.109 #28 NEW cov: 10737 ft: 16651 corp: 10/994b lim: 320 exec/s: 14 rss: 70Mb L: 91/159 MS: 1 ChangeBinInt-
00:08:41.109 #28 DONE cov: 10737 ft: 16651 corp: 10/994b lim: 320 exec/s: 14 rss: 70Mb
00:08:41.109 ###### Recommended dictionary. ######
00:08:41.109 "\000\000\000\000" # Uses: 0
00:08:41.109 ###### End of recommended dictionary. ######
00:08:41.109 Done 28 runs in 2 second(s)
00:08:41.368 02:56:36 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3
00:08:41.368 02:56:36 -- ../common.sh@72 -- # (( i++ ))
00:08:41.368 02:56:36 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:41.368 02:56:36 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1
00:08:41.368 02:56:36 -- vfio/run.sh@22 -- # local fuzzer_type=4
00:08:41.368 02:56:36 -- vfio/run.sh@23 -- # local timen=1
00:08:41.368 02:56:36 -- vfio/run.sh@24 -- # local core=0x1
00:08:41.368 02:56:36 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:08:41.368 02:56:36 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4
00:08:41.368 02:56:36 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1
00:08:41.368 02:56:36 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2
00:08:41.368 02:56:36 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf
00:08:41.368 02:56:36 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:08:41.368 02:56:36 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%;
00:08:41.368 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:41.368 02:56:36 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4
00:08:41.628 [2024-07-14 02:56:36.478577] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:41.628 [2024-07-14 02:56:36.478647] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid679721 ]
00:08:41.628 EAL: No free 2048 kB hugepages reported on node 1
00:08:41.628 [2024-07-14 02:56:36.545889] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:41.628 [2024-07-14 02:56:36.581479] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:41.628 [2024-07-14 02:56:36.581617] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:41.628 INFO: Running with entropic power schedule (0xFF, 100).
00:08:41.628 INFO: Seed: 2401612230
00:08:41.628 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:41.628 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:41.628 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:08:41.628 INFO: A corpus is not provided, starting from an empty corpus
00:08:41.628 #2 INITED exec/s: 0 rss: 59Mb
00:08:41.628 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:41.628 This may also happen if the target rejected all inputs we tried so far
00:08:42.146 NEW_FUNC[1/631]: 0x493100 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145
00:08:42.146 NEW_FUNC[2/631]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:42.146 #3 NEW cov: 10701 ft: 10226 corp: 2/88b lim: 320 exec/s: 0 rss: 67Mb L: 87/87 MS: 1 InsertRepeatedBytes-
00:08:42.146 NEW_FUNC[1/1]: 0x1368850 in index_to_sg_t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:669
00:08:42.146 #9 NEW cov: 10716 ft: 13148 corp: 3/175b lim: 320 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 ShuffleBytes-
00:08:42.405 #11 NEW cov: 10719 ft: 13771 corp: 4/209b lim: 320 exec/s: 0 rss: 69Mb L: 34/87 MS: 2 ChangeByte-CrossOver-
00:08:42.405 #12 NEW cov: 10719 ft: 14041 corp: 5/296b lim: 320 exec/s: 0 rss: 69Mb L: 87/87 MS: 1 ChangeByte-
00:08:42.664 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:42.664 #13 NEW cov: 10736 ft: 14566 corp: 6/403b lim: 320 exec/s: 0 rss: 69Mb L: 107/107 MS: 1 InsertRepeatedBytes-
00:08:42.664 #14 NEW cov: 10736 ft: 15333 corp: 7/453b lim: 320 exec/s: 14 rss: 69Mb L: 50/107 MS: 1 EraseBytes-
00:08:42.664 #15 NEW cov: 10736 ft: 15354 corp: 8/560b lim: 320 exec/s: 15 rss: 69Mb L: 107/107 MS: 1 ChangeBinInt-
00:08:42.924 #16 NEW cov: 10736 ft: 15531 corp: 9/611b lim: 320 exec/s: 16 rss: 69Mb L: 51/107 MS: 1 EraseBytes-
00:08:42.924 #17 NEW cov: 10736 ft: 15892 corp: 10/704b lim: 320 exec/s: 17 rss: 69Mb L: 93/107 MS: 1 InsertRepeatedBytes-
00:08:43.183 #18 NEW cov: 10736 ft: 16808 corp: 11/827b lim: 320 exec/s: 18 rss: 69Mb L: 123/123 MS: 1 CrossOver-
00:08:43.183 #19 NEW cov: 10736 ft: 16829 corp: 12/918b lim: 320 exec/s: 19 rss: 69Mb L: 91/123 MS: 1 CopyPart-
00:08:43.441 #20 NEW cov: 10736 ft: 16902 corp: 13/1089b lim: 320 exec/s: 20 rss: 70Mb L: 171/171 MS: 1 InsertRepeatedBytes-
00:08:43.441 #26 NEW cov: 10736 ft: 16984 corp: 14/1176b lim: 320 exec/s: 26 rss: 70Mb L: 87/171 MS: 1 ChangeByte-
00:08:43.441 #27 NEW cov: 10743 ft: 17054 corp: 15/1351b lim: 320 exec/s: 27 rss: 70Mb L: 175/175 MS: 1 InsertRepeatedBytes-
00:08:43.700 #28 NEW cov: 10743 ft: 17201 corp: 16/1401b lim: 320 exec/s: 14 rss: 70Mb L: 50/175 MS: 1 ChangeByte-
00:08:43.700 #28 DONE cov: 10743 ft: 17201 corp: 16/1401b lim: 320 exec/s: 14 rss: 70Mb
00:08:43.700 Done 28 runs in 2 second(s)
00:08:43.959 02:56:39 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4
00:08:43.959 02:56:39 -- ../common.sh@72 -- # (( i++ ))
00:08:43.959 02:56:39 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:43.959 02:56:39 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
00:08:43.959 02:56:39 -- vfio/run.sh@22 -- # local fuzzer_type=5
00:08:43.959 02:56:39 -- vfio/run.sh@23 -- # local timen=1
00:08:43.959 02:56:39 -- vfio/run.sh@24 -- # local core=0x1
00:08:43.959 02:56:39 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:43.959 02:56:39 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5
00:08:43.959 02:56:39 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1
00:08:43.959 02:56:39 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2
00:08:43.959 02:56:39 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf
00:08:43.959 02:56:39 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:43.959 02:56:39 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%;
00:08:43.959 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:43.959 02:56:39 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5
00:08:43.959 [2024-07-14 02:56:39.076186] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:43.959 [2024-07-14 02:56:39.076250] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid680266 ]
00:08:43.959 EAL: No free 2048 kB hugepages reported on node 1
00:08:43.959 [2024-07-14 02:56:39.142409] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:43.959 [2024-07-14 02:56:39.178285] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:43.959 [2024-07-14 02:56:39.178426] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:44.218 INFO: Running with entropic power schedule (0xFF, 100).
00:08:44.218 INFO: Seed: 704625166
00:08:44.218 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:44.218 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:44.218 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:44.218 INFO: A corpus is not provided, starting from an empty corpus
00:08:44.218 #2 INITED exec/s: 0 rss: 59Mb
00:08:44.218 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:44.218 This may also happen if the target rejected all inputs we tried so far
00:08:44.218 [2024-07-14 02:56:39.454489] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:44.218 [2024-07-14 02:56:39.454534] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:44.735 NEW_FUNC[1/638]: 0x493b00 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172
00:08:44.735 NEW_FUNC[2/638]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:44.735 #13 NEW cov: 10729 ft: 10537 corp: 2/77b lim: 120 exec/s: 0 rss: 66Mb L: 76/76 MS: 1 InsertRepeatedBytes-
00:08:44.735 [2024-07-14 02:56:39.937930] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:44.735 [2024-07-14 02:56:39.937975] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:44.995 #15 NEW cov: 10743 ft: 13028 corp: 3/121b lim: 120 exec/s: 0 rss: 67Mb L: 44/76 MS: 2 ChangeByte-InsertRepeatedBytes-
00:08:44.995 [2024-07-14 02:56:40.132055] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:44.995 [2024-07-14 02:56:40.132096] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:44.995 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:44.995 #16 NEW cov: 10760 ft: 14749 corp: 4/179b lim: 120 exec/s: 0 rss: 68Mb L: 58/76 MS: 1 EraseBytes-
00:08:45.254 [2024-07-14 02:56:40.317922] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:45.254 [2024-07-14 02:56:40.317954] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:45.254 #17 NEW cov: 10760 ft: 15620 corp: 5/237b lim: 120 exec/s: 17 rss: 68Mb L: 58/76 MS: 1 CMP- DE: "\001)\355\266\354\215jR"-
00:08:45.254 [2024-07-14 02:56:40.504229] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:45.254 [2024-07-14 02:56:40.504260] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:45.512 #18 NEW cov: 10760 ft: 16078 corp: 6/289b lim: 120 exec/s: 18 rss: 69Mb L: 52/76 MS: 1 PersAutoDict- DE: "\001)\355\266\354\215jR"-
00:08:45.512 [2024-07-14 02:56:40.690822] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:45.512 [2024-07-14 02:56:40.690852] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:45.771 #19 NEW cov: 10760 ft: 16519 corp: 7/323b lim: 120 exec/s: 19 rss: 69Mb L: 34/76 MS: 1 EraseBytes-
00:08:45.771 [2024-07-14 02:56:40.875226] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:45.771 [2024-07-14 02:56:40.875255] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:45.771 #20 NEW cov: 10760 ft: 16598 corp: 8/357b lim: 120 exec/s: 20 rss: 69Mb L: 34/76 MS: 1 ChangeByte-
00:08:46.028 [2024-07-14 02:56:41.059129] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.028 [2024-07-14 02:56:41.059161] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.028 #21 NEW cov: 10760 ft: 16851 corp: 9/409b lim: 120 exec/s: 21 rss: 69Mb L: 52/76 MS: 1 PersAutoDict- DE: "\001)\355\266\354\215jR"-
00:08:46.028 [2024-07-14 02:56:41.242947] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.028 [2024-07-14 02:56:41.242976] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.314 #22 NEW cov: 10767 ft: 17141 corp: 10/453b lim: 120 exec/s: 22 rss: 69Mb L: 44/76 MS: 1 PersAutoDict- DE: "\001)\355\266\354\215jR"-
00:08:46.314 [2024-07-14 02:56:41.428106] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.314 [2024-07-14 02:56:41.428136] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.314 #23 NEW cov: 10767 ft: 17330 corp: 11/529b lim: 120 exec/s: 11 rss: 69Mb L: 76/76 MS: 1 CopyPart-
00:08:46.314 #23 DONE cov: 10767 ft: 17330 corp: 11/529b lim: 120 exec/s: 11 rss: 69Mb
00:08:46.314 ###### Recommended dictionary. ######
00:08:46.314 "\001)\355\266\354\215jR" # Uses: 3
00:08:46.314 ###### End of recommended dictionary. ######
00:08:46.314 Done 23 runs in 2 second(s)
00:08:46.573 02:56:41 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5
00:08:46.573 02:56:41 -- ../common.sh@72 -- # (( i++ ))
00:08:46.573 02:56:41 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:46.573 02:56:41 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:08:46.573 02:56:41 -- vfio/run.sh@22 -- # local fuzzer_type=6
00:08:46.573 02:56:41 -- vfio/run.sh@23 -- # local timen=1
00:08:46.573 02:56:41 -- vfio/run.sh@24 -- # local core=0x1
00:08:46.573 02:56:41 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:46.573 02:56:41 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
00:08:46.573 02:56:41 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
00:08:46.573 02:56:41 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
00:08:46.573 02:56:41 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
00:08:46.573 02:56:41 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:46.573 02:56:41 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:08:46.573 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:46.573 02:56:41 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
00:08:46.831 [2024-07-14 02:56:41.830764] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:46.831 [2024-07-14 02:56:41.830835] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid680734 ]
00:08:46.831 EAL: No free 2048 kB hugepages reported on node 1
00:08:46.831 [2024-07-14 02:56:41.901631] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:46.831 [2024-07-14 02:56:41.938286] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:46.831 [2024-07-14 02:56:41.938434] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:47.089 INFO: Running with entropic power schedule (0xFF, 100).
00:08:47.089 INFO: Seed: 3459630070
00:08:47.089 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:47.089 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:47.089 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:47.089 INFO: A corpus is not provided, starting from an empty corpus
00:08:47.089 #2 INITED exec/s: 0 rss: 59Mb
00:08:47.089 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:47.089 This may also happen if the target rejected all inputs we tried so far
00:08:47.089 [2024-07-14 02:56:42.219477] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:47.089 [2024-07-14 02:56:42.219518] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:47.606 NEW_FUNC[1/638]: 0x4947f0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:08:47.606 NEW_FUNC[2/638]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:47.606 #9 NEW cov: 10725 ft: 10544 corp: 2/37b lim: 90 exec/s: 0 rss: 67Mb L: 36/36 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:08:47.606 [2024-07-14 02:56:42.684494] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:47.606 [2024-07-14 02:56:42.684537] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:47.606 #16 NEW cov: 10739 ft: 13489 corp: 3/114b lim: 90 exec/s: 0 rss: 68Mb L: 77/77 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:08:47.865 [2024-07-14 02:56:42.868172] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:47.865 [2024-07-14 02:56:42.868207] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:47.865 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:47.865 #21 NEW cov: 10756 ft: 15341 corp: 4/192b lim: 90 exec/s: 0 rss: 69Mb L: 78/78 MS: 5 ShuffleBytes-ChangeBinInt-ChangeBit-ChangeBit-InsertRepeatedBytes-
00:08:47.865 [2024-07-14 02:56:43.051083] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:47.865 [2024-07-14 02:56:43.051114] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:48.123 #22 NEW cov: 10756 ft: 15685 corp: 5/244b lim: 90 exec/s: 22 rss: 69Mb L: 52/78 MS: 1 InsertRepeatedBytes-
00:08:48.123 [2024-07-14 02:56:43.224754] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:48.123 [2024-07-14 02:56:43.224785] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:48.381 #23 NEW cov: 10756 ft: 16505 corp: 6/281b lim: 90 exec/s: 23 rss: 69Mb L: 37/78 MS: 1 InsertByte-
00:08:48.381 [2024-07-14 02:56:43.398407] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:48.381 [2024-07-14 02:56:43.398438] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:48.381 #24 NEW cov: 10756 ft: 16887 corp: 7/318b lim: 90 exec/s: 24 rss: 69Mb L: 37/78 MS: 1 CrossOver-
00:08:48.640 [2024-07-14 02:56:43.573205] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:48.640 [2024-07-14 02:56:43.573234] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:48.640 #25 NEW cov: 10756 ft: 16977 corp: 8/354b lim: 90 exec/s: 25 rss: 69Mb L: 36/78 MS: 1 ChangeByte-
00:08:48.640 [2024-07-14 02:56:43.747606] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:48.640 [2024-07-14 02:56:43.747636] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:48.900 #26 NEW cov: 10756 ft: 17028 corp: 9/439b lim: 90 exec/s: 26 rss: 69Mb L: 85/85 MS: 1 CopyPart-
00:08:48.900 [2024-07-14 02:56:43.922109] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:48.900 [2024-07-14 02:56:43.922137] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:48.900 #27 NEW cov: 10763 ft: 17149 corp: 10/476b lim: 90 exec/s: 27 rss: 69Mb L: 37/85 MS: 1 InsertByte-
00:08:48.900 [2024-07-14 02:56:44.095733] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:48.900 [2024-07-14 02:56:44.095765] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:49.158 #28 NEW cov: 10763 ft: 17463 corp: 11/512b lim: 90 exec/s: 14 rss: 69Mb L: 36/85 MS: 1 ChangeBit-
00:08:49.158 #28 DONE cov: 10763 ft: 17463 corp: 11/512b lim: 90 exec/s: 14 rss: 69Mb
00:08:49.158 Done 28 runs in 2 second(s)
00:08:49.417 02:56:44 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6
00:08:49.417 02:56:44 -- ../common.sh@72 -- # (( i++ ))
00:08:49.417 02:56:44 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:49.417 02:56:44 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT
00:08:49.417
00:08:49.417 real 0m18.848s
00:08:49.417 user 0m26.101s
00:08:49.417 sys 0m1.737s
00:08:49.417 02:56:44 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:49.417 02:56:44 -- common/autotest_common.sh@10 -- # set +x
00:08:49.417 ************************************
00:08:49.417 END TEST vfio_fuzz
00:08:49.417 ************************************
00:08:49.417 02:56:44 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]]
00:08:49.417
00:08:49.417 real 1m21.906s
00:08:49.417 user 2m4.717s
00:08:49.417 sys 0m9.611s
00:08:49.417 02:56:44 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:49.417 02:56:44 -- common/autotest_common.sh@10 -- # set +x
00:08:49.417 ************************************
00:08:49.417 END TEST llvm_fuzz
00:08:49.417 ************************************
00:08:49.417 02:56:44 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]]
00:08:49.417 02:56:44 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT
00:08:49.417 02:56:44 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup
00:08:49.417 02:56:44 -- common/autotest_common.sh@712 -- # xtrace_disable
00:08:49.417 02:56:44 -- common/autotest_common.sh@10 -- # set +x
00:08:49.417 02:56:44 -- spdk/autotest.sh@386 -- # autotest_cleanup
00:08:49.417 02:56:44 -- common/autotest_common.sh@1371 -- # local autotest_es=0
00:08:49.417 02:56:44 -- common/autotest_common.sh@1372 -- # xtrace_disable
00:08:49.417 02:56:44 -- common/autotest_common.sh@10 -- # set +x
00:08:55.990 INFO: APP EXITING
00:08:55.990 INFO: killing all VMs
00:08:55.990 INFO: killing vhost app
00:08:55.990 INFO: EXIT DONE
00:08:58.528 Waiting for block devices as requested
00:08:58.528 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:58.528 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:58.785 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:58.785 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:58.785 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:59.043 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:59.044 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:59.044 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:59.044 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:59.302 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:59.302 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:59.302 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:59.560 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:59.560 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:59.560 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:59.819 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:59.819 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:03.111 Cleaning
00:09:03.111 Removing: /dev/shm/spdk_tgt_trace.pid644078
00:09:03.111 Removing: /var/run/dpdk/spdk_pid641596
00:09:03.111 Removing: /var/run/dpdk/spdk_pid642866
00:09:03.111 Removing: /var/run/dpdk/spdk_pid644078
00:09:03.111 Removing: /var/run/dpdk/spdk_pid644837
00:09:03.111 Removing: /var/run/dpdk/spdk_pid645122
00:09:03.111 Removing: /var/run/dpdk/spdk_pid645437
00:09:03.111 Removing: /var/run/dpdk/spdk_pid645812
00:09:03.111 Removing: /var/run/dpdk/spdk_pid646084
00:09:03.111 Removing: /var/run/dpdk/spdk_pid646218
00:09:03.111 Removing: /var/run/dpdk/spdk_pid646481
00:09:03.111 Removing: /var/run/dpdk/spdk_pid646796
00:09:03.111 Removing: /var/run/dpdk/spdk_pid647652
00:09:03.112 Removing: /var/run/dpdk/spdk_pid650846
00:09:03.112 Removing: /var/run/dpdk/spdk_pid651481
00:09:03.112 Removing: /var/run/dpdk/spdk_pid651969
00:09:03.112 Removing: /var/run/dpdk/spdk_pid652018
00:09:03.112 Removing: /var/run/dpdk/spdk_pid652596
00:09:03.112 Removing: /var/run/dpdk/spdk_pid652824
00:09:03.112 Removing: /var/run/dpdk/spdk_pid653173
00:09:03.112 Removing: /var/run/dpdk/spdk_pid653437
00:09:03.112 Removing: /var/run/dpdk/spdk_pid653746
00:09:03.112 Removing: /var/run/dpdk/spdk_pid653791
00:09:03.112 Removing: /var/run/dpdk/spdk_pid654046
00:09:03.112 Removing: /var/run/dpdk/spdk_pid654289
00:09:03.112 Removing: /var/run/dpdk/spdk_pid654690
00:09:03.112 Removing: /var/run/dpdk/spdk_pid654980
00:09:03.112 Removing: /var/run/dpdk/spdk_pid655233
00:09:03.112 Removing: /var/run/dpdk/spdk_pid655332
00:09:03.371 Removing: /var/run/dpdk/spdk_pid655632
00:09:03.371 Removing: /var/run/dpdk/spdk_pid655652
00:09:03.371 Removing: /var/run/dpdk/spdk_pid655847
00:09:03.371 Removing: /var/run/dpdk/spdk_pid655995
00:09:03.371 Removing: /var/run/dpdk/spdk_pid656269
00:09:03.371 Removing: /var/run/dpdk/spdk_pid656538
00:09:03.371 Removing: /var/run/dpdk/spdk_pid656827
00:09:03.371 Removing: /var/run/dpdk/spdk_pid657093
00:09:03.371 Removing: /var/run/dpdk/spdk_pid657287
00:09:03.371 Removing: /var/run/dpdk/spdk_pid657426
00:09:03.371 Removing: /var/run/dpdk/spdk_pid657689
00:09:03.372 Removing: /var/run/dpdk/spdk_pid657955
00:09:03.372 Removing: /var/run/dpdk/spdk_pid658236
00:09:03.372 Removing: /var/run/dpdk/spdk_pid658510
00:09:03.372 Removing: /var/run/dpdk/spdk_pid658744
00:09:03.372 Removing: /var/run/dpdk/spdk_pid658885
00:09:03.372 Removing: /var/run/dpdk/spdk_pid659093
00:09:03.372 Removing: /var/run/dpdk/spdk_pid659371
00:09:03.372 Removing: /var/run/dpdk/spdk_pid659653
00:09:03.372 Removing: /var/run/dpdk/spdk_pid659919
00:09:03.372 Removing: /var/run/dpdk/spdk_pid660206
00:09:03.372 Removing: /var/run/dpdk/spdk_pid660364
00:09:03.372 Removing: /var/run/dpdk/spdk_pid660533
00:09:03.372 Removing: /var/run/dpdk/spdk_pid660780
00:09:03.372 Removing: /var/run/dpdk/spdk_pid661071
00:09:03.372 Removing: /var/run/dpdk/spdk_pid661340
00:09:03.372 Removing: /var/run/dpdk/spdk_pid661621
00:09:03.372 Removing: /var/run/dpdk/spdk_pid661823
00:09:03.372 Removing: /var/run/dpdk/spdk_pid662007
00:09:03.372 Removing: /var/run/dpdk/spdk_pid662201
00:09:03.372 Removing: /var/run/dpdk/spdk_pid662482
00:09:03.372 Removing: /var/run/dpdk/spdk_pid662750
00:09:03.372 Removing: /var/run/dpdk/spdk_pid663042
00:09:03.372 Removing: /var/run/dpdk/spdk_pid663308
00:09:03.372 Removing: /var/run/dpdk/spdk_pid663527
00:09:03.372 Removing: /var/run/dpdk/spdk_pid663680
00:09:03.372 Removing: /var/run/dpdk/spdk_pid663905
00:09:03.372 Removing: /var/run/dpdk/spdk_pid664175
00:09:03.372 Removing: /var/run/dpdk/spdk_pid664465
00:09:03.372 Removing: /var/run/dpdk/spdk_pid664738
00:09:03.372 Removing: /var/run/dpdk/spdk_pid665021
00:09:03.372 Removing: /var/run/dpdk/spdk_pid665215
00:09:03.372 Removing: /var/run/dpdk/spdk_pid665422
00:09:03.372 Removing: /var/run/dpdk/spdk_pid665644
00:09:03.372 Removing: /var/run/dpdk/spdk_pid665758
00:09:03.372 Removing: /var/run/dpdk/spdk_pid666435
00:09:03.372 Removing: /var/run/dpdk/spdk_pid666922
00:09:03.372 Removing: /var/run/dpdk/spdk_pid667276
00:09:03.372 Removing: /var/run/dpdk/spdk_pid667819
00:09:03.372 Removing: /var/run/dpdk/spdk_pid668323
00:09:03.372 Removing: /var/run/dpdk/spdk_pid668656
00:09:03.372 Removing: /var/run/dpdk/spdk_pid669194
00:09:03.372 Removing: /var/run/dpdk/spdk_pid669737
00:09:03.372 Removing: /var/run/dpdk/spdk_pid670033
00:09:03.372 Removing: /var/run/dpdk/spdk_pid670570
00:09:03.372 Removing: /var/run/dpdk/spdk_pid671102
00:09:03.372 Removing: /var/run/dpdk/spdk_pid671411
00:09:03.372 Removing: /var/run/dpdk/spdk_pid671948
00:09:03.372 Removing: /var/run/dpdk/spdk_pid672341
00:09:03.372 Removing: /var/run/dpdk/spdk_pid672775
00:09:03.372 Removing: /var/run/dpdk/spdk_pid673314
00:09:03.372 Removing: /var/run/dpdk/spdk_pid673619
00:09:03.372 Removing: /var/run/dpdk/spdk_pid674150
00:09:03.631 Removing: /var/run/dpdk/spdk_pid674621
00:09:03.631 Removing: /var/run/dpdk/spdk_pid674982
00:09:03.631 Removing: /var/run/dpdk/spdk_pid675520
00:09:03.631 Removing: /var/run/dpdk/spdk_pid675899
00:09:03.631 Removing: /var/run/dpdk/spdk_pid676354
00:09:03.631 Removing: /var/run/dpdk/spdk_pid676897
00:09:03.631 Removing: /var/run/dpdk/spdk_pid677238
00:09:03.631 Removing: /var/run/dpdk/spdk_pid677794
00:09:03.631 Removing: /var/run/dpdk/spdk_pid678337
00:09:03.631 Removing: /var/run/dpdk/spdk_pid678879
00:09:03.631 Removing: /var/run/dpdk/spdk_pid679233
00:09:03.631 Removing: /var/run/dpdk/spdk_pid679721
00:09:03.631 Removing: /var/run/dpdk/spdk_pid680266
00:09:03.631 Removing: /var/run/dpdk/spdk_pid680734
00:09:03.631 Clean
00:09:03.631 killing process with pid 597945
00:09:06.998 killing process with pid 597942
00:09:06.998 killing process with pid 597944
00:09:07.257 killing process with pid 597943
00:09:07.257 02:57:02 -- common/autotest_common.sh@1436 -- # return 0
00:09:07.257 02:57:02 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup
00:09:07.257 02:57:02 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:07.257 02:57:02 -- common/autotest_common.sh@10 -- # set +x
00:09:07.257 02:57:02 -- spdk/autotest.sh@389 -- # timing_exit autotest
00:09:07.257 02:57:02 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:07.257 02:57:02 -- common/autotest_common.sh@10 -- # set +x
00:09:07.257 02:57:02 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:07.257 02:57:02 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:07.257 02:57:02 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:07.257 02:57:02 -- spdk/autotest.sh@394 -- # hash lcov
00:09:07.257 02:57:02 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:09:07.257 02:57:02 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:07.257 02:57:02 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:07.257 02:57:02 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:07.257 02:57:02 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:07.257 02:57:02 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:07.257 02:57:02 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:07.257 02:57:02 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:07.257 02:57:02 -- paths/export.sh@5 -- $ export PATH
00:09:07.257 02:57:02 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:07.257 02:57:02 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:07.257 02:57:02 -- common/autobuild_common.sh@435 -- $ date +%s
00:09:07.257 02:57:02 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720918622.XXXXXX
00:09:07.257 02:57:02 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720918622.WQ9GV0
00:09:07.257 02:57:02 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:09:07.257 02:57:02 -- common/autobuild_common.sh@441 -- $ '[' -n v22.11.4 ']'
00:09:07.257 02:57:02 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:07.257 02:57:02 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:09:07.257 02:57:02 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:07.257 02:57:02 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:07.257 02:57:02 -- common/autobuild_common.sh@451 -- $ get_config_params
00:09:07.257 02:57:02 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:09:07.257 02:57:02 -- common/autotest_common.sh@10 -- $ set +x
00:09:07.516 02:57:02 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:09:07.516 02:57:02 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:07.516 02:57:02 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:07.516 02:57:02 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:07.516 02:57:02 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:07.516 02:57:02 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:07.516 02:57:02 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:07.516 02:57:02 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:07.516 02:57:02 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:07.516 02:57:02 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:07.516 02:57:02 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:07.516 + [[ -n 542086 ]]
00:09:07.516 + sudo kill 542086
00:09:07.525 [Pipeline] }
00:09:07.543 [Pipeline] // stage
00:09:07.549 [Pipeline] }
00:09:07.567 [Pipeline] // timeout
00:09:07.572 [Pipeline] }
00:09:07.590 [Pipeline] // catchError
00:09:07.595 [Pipeline] }
00:09:07.613 [Pipeline] // wrap
00:09:07.619 [Pipeline] }
00:09:07.635 [Pipeline] // catchError
00:09:07.644 [Pipeline] stage
00:09:07.647 [Pipeline] { (Epilogue)
00:09:07.661 [Pipeline] catchError
00:09:07.663 [Pipeline] {
00:09:07.677 [Pipeline] echo
00:09:07.679 Cleanup processes
00:09:07.708 [Pipeline] sh
00:09:07.992 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:07.992 689738 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:08.006 [Pipeline] sh
00:09:08.288 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:08.288 ++ grep -v 'sudo pgrep'
00:09:08.288 ++ awk '{print $1}'
00:09:08.288 + sudo kill -9
00:09:08.288 + true
00:09:08.299 [Pipeline] sh
00:09:08.580 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:08.580 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:08.581 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:09.528 [Pipeline] sh
00:09:09.810 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:09.810 Artifacts sizes are good
00:09:09.824 [Pipeline] archiveArtifacts
00:09:09.831 Archiving artifacts
00:09:09.888 [Pipeline] sh
00:09:10.173 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:10.187 [Pipeline] cleanWs
00:09:10.196 [WS-CLEANUP] Deleting project workspace...
00:09:10.197 [WS-CLEANUP] Deferred wipeout is used...
00:09:10.203 [WS-CLEANUP] done
00:09:10.204 [Pipeline] }
00:09:10.224 [Pipeline] // catchError
00:09:10.235 [Pipeline] sh
00:09:10.515 + logger -p user.info -t JENKINS-CI
00:09:10.525 [Pipeline] }
00:09:10.542 [Pipeline] // stage
00:09:10.548 [Pipeline] }
00:09:10.562 [Pipeline] // node
00:09:10.568 [Pipeline] End of Pipeline
00:09:10.597 Finished: SUCCESS